Background: A challenge for Problem-Based Learning (PBL) schools is to introduce reliable, valid, and cost-effective testing methods into the curriculum in such a way as to maximize the potential benefits of PBL while avoiding problems associated with assessment techniques such as multiple-choice question (MCQ) tests.
Purpose: We document the continued development of an exam that was designed to satisfy the demands of both PBL and the scientific principles of measurement.
Methods: A total of 102 medical students wrote a clinical reasoning exercise (CRE) as a requirement for two consecutive units of instruction. Each CRE consisted of a series of 18 short clinical problems designed to assess a student's knowledge of the mechanisms of disease covered in three subunits within each unit. Responses were scored by the student's own tutor and by a second, crossover tutor.
Results: Generalizability coefficients for raters, subunits, and individual problems were low, but the reliability of the overall test scores and the reliability of scores across the two units of instruction were high. Subsequent analyses found that the crossover tutor's ratings were lower than the ratings provided by a student's own tutor, and that CRE scores correlated with the biology component of a progress test.
Conclusion: The magnitude of the generalizability coefficients demonstrates that the CRE is capable of detecting differences in reasoning across knowledge domains and is therefore a useful evaluation tool.