In 2006, I transitioned a manually graded, lab-based exam into one that combined hands-on activities with multiple-choice questions. Given the limitations of the technology at the time, I had to be a little creative. My blog post for March chronicles how I envisioned the setup and shares a few lessons learned from the experience.
After a large enterprise organization acquired the business technology optimization company I was working for, our education services team faced the challenge of integrating our manually graded, lab-based certification exams into the written exams that the enterprise company's mature, well-oiled credential program developed to standards and delivered through brick-and-mortar test centers.
Our certification program offered a lab-based exam in which a candidate worked through activities and saved files in a virtual image, then answered questions (short text, multiple choice, fill-in-the-blank) in a test booklet. An expert would be assigned to check the code and files in the virtual image, as well as the answers in the booklet, against an answer key.
The process worked for a few thousand candidates annually, but scaling was difficult, especially as we had become part of one of the largest technology organizations in the world.
I participated in meetings about transitioning our exams to written tests. Members of the credential program offered their support and resources to help migrate data and exam content. Still, there was some hesitation about the higher-level certification: management wanted to keep some hands-on portions of the assessment.
As our teams prepared for the annual software universe conference, I listened to one credential expert explain that, for off-site events, they used internet-based testing with a proctor in the conference testing rooms. As soon as I heard that, the pieces of a puzzle fit together in my mind.
Would a candidate be able to use the virtual environment to complete the hands-on work and then answer questions hosted in the internet-based testing application, so that the candidate's responses were captured in the exam delivery platform?
Yes. Candidates were able to run both the internet-based testing application and a virtual environment during the exam. This format was used for several years by the software arm of the enterprise company.
Some lessons learned:
Scaling improved, but we still ran into limits on the available capacity of virtual environments, and those limits were exacerbated by the capacity needed for scheduled training classes.
The design of the performance-based activities remained the same: candidates were asked to set up and configure, write code, and automate requirements using the software. The change occurred in the objectives and item types. Instead of asking for a specific method to arrive at an answer, I formulated questions that focused on specific outcomes. For example, instead of asking how a candidate coded to capture content in a variable, the question asked for the specific value that was captured.
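To illustrate that outcome-focused item style, here is a minimal, hypothetical sketch (the task, data, and variable name are invented for illustration; the original exams used the company's own software, not Python). The candidate completes the hands-on task in the virtual environment; the objective item then asks only for the resulting value, which can be scored in the testing application.

```python
# Hypothetical hands-on task: given a log snippet in the virtual
# environment, capture the number of ERROR entries in a variable.
log_lines = [
    "2006-03-01 INFO start",
    "2006-03-01 ERROR disk full",
    "2006-03-02 ERROR timeout",
]

# Candidate's work: any correct method is acceptable, because the
# exam item does not grade the approach.
error_count = sum(1 for line in log_lines if "ERROR" in line)

# The multiple-choice item asks for the outcome, not the method:
# "After completing the task, what value is captured in error_count?"
print(error_count)  # keyed answer: 2
```

The point of the design is that the grading moves from an expert inspecting the candidate's code to the delivery platform scoring a single, verifiable answer.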
It's been fifteen years since. Nowadays, I hear of companies that provide candidates with on-demand virtual environments they can schedule, taking their tests remotely with a live proctor. That sounds cool (nerd alert!) and a long way from what I had to work with. Given today's technology, it would be interesting to design both: a fully performance-based test graded automatically from the candidate's actions in the virtual environment, and one where performance is paired with an objective test graded from the responses to exam questions.
Image attribution through Creative Commons license: Dan Hersham - blank tab in Chrome, University of Wisconsin - Lisp programming, Emblem person icon
#exam #design #engineer #creativity #idea #technology #certification #learning #innovate #resourcefulness