Interpretation of emergency department radiographs: A comparison of emergency medicine physicians with radiologists, residents with faculty, and film with digital display
J. Eng et al., Interpretation of emergency department radiographs: A comparison of emergency medicine physicians with radiologists, residents with faculty, and film with digital display, AM J ROENTG, 175(5), 2000, pp. 1233-1238
Number of citations
18
Subject categories
Radiology, Nuclear Medicine & Imaging; Medical Research Diagnosis & Treatment
OBJECTIVE. We determined the relative value of teleradiology and radiology resident coverage of the emergency department by measuring and comparing the effects of physician specialty, training level, and image display method on accuracy of radiograph interpretation.
MATERIALS AND METHODS. A sample of four faculty emergency medicine physicians, four emergency medicine residents, four faculty radiologists, and four radiology residents participated in our study. Each physician interpreted 120 radiographs, approximately half containing a clinically important index finding. Radiographs were interpreted using the original films and high-resolution digital monitors. Accuracy of radiograph interpretation was measured as the area under the physicians' receiver operating characteristic (ROC) curves.
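As a point of reference, the area under an ROC curve can be computed from a reader's per-case confidence ratings together with the case truth labels. The following is a minimal sketch in Python using scikit-learn; the labels, the 1-5 rating scale, and all values are illustrative assumptions, not data from the study.

# Minimal sketch: AUC from one reader's confidence ratings.
# Hypothetical data, not study data; assumes scikit-learn is installed.
import numpy as np
from sklearn.metrics import roc_auc_score

truth = np.array([1, 1, 0, 1, 0, 0, 1, 0, 0, 1])    # 1 = index finding present
rating = np.array([5, 4, 2, 3, 1, 2, 5, 3, 1, 4])   # reader confidence, 1-5 scale

auc = roc_auc_score(truth, rating)                   # area under the ROC curve
print(f"AUC = {auc:.2f}")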
RESULTS. The area under the ROC curve was 0.15 (95% confidence interval [CI], 0.10-0.20) greater for radiologists than for emergency medicine physicians, 0.07 (95% CI, 0.02-0.12) greater for faculty than for residents, and 0.07 (95% CI, 0.02-0.12) greater for films than for video monitors. Using these results, we estimated that teleradiology coverage by faculty radiologists would add 0.09 (95% CI, 0.03-0.15) to the area under the ROC curve for radiograph interpretation by emergency medicine faculty alone, and radiology resident coverage would add 0.08 (95% CI, 0.02-0.14) to this area.
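The comparisons above are differences in ROC area with 95% confidence intervals. One way such an interval could be obtained for two readers rating the same cases is a paired bootstrap over cases, sketched below; the data and reader names are hypothetical, and this is not presented as the statistical model used in the study.

# Sketch: paired bootstrap 95% CI for the difference in AUC between two readers.
# All values are hypothetical; reader_a and reader_b are illustrative names.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
truth = np.array([1, 1, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0])
reader_a = np.array([5, 4, 2, 4, 1, 2, 5, 3, 1, 4, 4, 2])  # e.g., radiologist
reader_b = np.array([4, 3, 3, 4, 2, 2, 4, 3, 2, 3, 3, 3])  # e.g., emergency physician

diffs = []
n = len(truth)
for _ in range(2000):
    idx = rng.integers(0, n, n)           # resample cases with replacement
    t = truth[idx]
    if t.min() == t.max():                # both classes are needed to compute an AUC
        continue
    diffs.append(roc_auc_score(t, reader_a[idx]) - roc_auc_score(t, reader_b[idx]))

point = roc_auc_score(truth, reader_a) - roc_auc_score(truth, reader_b)
lo, hi = np.percentile(diffs, [2.5, 97.5])
print(f"AUC difference = {point:.2f} (95% CI {lo:.2f} to {hi:.2f})")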
CONCLUSION. We observed significant differences between the interpretation of radiographs on film and on digital monitors. However, we observed differences of equal or greater magnitude associated with the training level and physician specialty of each observer. In evaluating teleradiology services, observer characteristics must be considered in addition to the quality of image display.