Purpose. To evaluate the reliability, efficiency, and cost of administering
open-ended test questions by computer.
Methods. A total of 1,194 students, in groups of approximately 30, were tested at the end of a required surgical clerkship from 1993 through 1998. For the academic years 1993-94 and 1994-95, the administration of open-ended test questions by computer was compared experimentally with paper-and-pencil administration. The paper-and-pencil mode of the test was discontinued in 1995, and the administration of the test by computer was evaluated for all students through 1998. Computerized item analysis of responses was added to the students' post-examination review session in 1996.
Results. There was no significant difference in the performances of the 440 students tested in 1993-94 and 1994-95 between the two modes of test administration. Alpha reliability estimates were comparable. Most students preferred the computer administration, which the faculty judged to be efficient and cost-effective. The immediate availability of item-analysis data strengthened the post-examination review sessions.
Conclusion. Routine administration of open-ended test questions by computer is practical, and it enables faculty to provide feedback to students immediately after the examination.