J.J.M. Jansen et al., Performance-based assessment in continuing medical education for general practitioners: construct validity, Medical Education, 30(5), 1996, pp. 339-344
The use of performance-based assessment has been extended to postgraduate education and practising doctors, despite criticism of its validity. While differences in expertise at this level are readily reflected in scores on a written test, these differences are relatively small on performance-based tests. Nevertheless, scores on written tests and performance-based tests of clinical competence generally show moderate correlations. A study was designed to evaluate the construct validity of a performance-based test of technical clinical skills in continuing medical education for general practitioners, and to explore the correlation between performance and knowledge of specific skills. A 1-day skills training course covering four different technical clinical skills was given to 71 general practitioners. The effect of the training on performance was measured with a performance-based test in a randomized controlled trial design, while the effect on knowledge was measured with a written test administered 1 month before and directly after the training. A training effect could be demonstrated by the performance-based test for all four clinical skills. The written test also demonstrated a training effect for all but one skill. However, correlations between scores on the written test and on the performance-based test were low for all skills. It is concluded that the construct validity of a performance-based test for technical clinical skills of general practitioners was demonstrated, while the knowledge test score was shown to be a poor predictor of competence for specific technical skills.