Background: An OSCE was used to measure the ability of a cohort of residents to manage oncologic problems.

Methods: Nine oncologic clinical problems were presented to 56 surgical residents. Each problem contained a 5-minute data-gathering period (DGP) and a 5-minute data-interpretation period (DIP). A performance score was determined for each resident for each problem. Reliability was estimated by coefficient alpha; validity, by the construct of experience. Wilks's lambda criterion was used to determine whether training level could be identified by OSCE performance.

Results: The DGP reliability was 0.80; the DIP reliability, 0.49. Senior residents performed significantly better than junior residents (P = 0.0001), who performed significantly better than interns (P = 0.0009). Of the residents, 62% were competent on the DGP, but only 21% on the DIP. Important deficits in knowledge and clinical skills were apparent at all levels of training.

Conclusion: The education and evaluation of residents in oncology need improvement. (C) 1997 Wiley-Liss, Inc.