J.P.R. Herrman et al., "Interobserver and intraobserver variability in the qualitative categorization of coronary angiograms", International Journal of Cardiac Imaging, 12(1), 1996, pp. 21-30
Citations: 16
Subject categories
Cardiac & Cardiovascular System; Radiology, Nuclear Medicine & Medical Imaging
The ABC classification of the American College of Cardiology and the American Heart Association is a commonly used categorization for estimating the risk and success of intracoronary intervention, as well as the probability of restenosis. To evaluate the reliability of qualitative angiogram readings, we randomly selected 200 films from single-lesion angioplasty procedures. A repeated visual assessment (at an interval of at least 2 months) by two independent observers yielded kappa values for inter- and intra-observer variability of the ABC lesion classification and of all the separate items that compose it. Variability in assessment is expressed as the percentage of total agreement and as the kappa value, a parameter of the agreement between two or more observations in excess of chance agreement. For the ABC classification, the percentage of total agreement was 67.8% and the kappa value 0.33, indicating poor agreement. This is probably due to the lack of strict definitions. Further investigation must demonstrate whether improvement can be achieved with complete, detailed and unambiguous definitions, and with consensus after panel assessment.
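The kappa statistic used in the abstract corrects raw percentage agreement for the agreement expected by chance alone. A minimal sketch of Cohen's kappa for two observers, using a hypothetical 2x2 contingency table (the counts below are illustrative, not from the study):

```python
def cohen_kappa(table):
    """Cohen's kappa for a square contingency table.

    table[i][j] counts cases where observer 1 assigned category i
    and observer 2 assigned category j.
    """
    n = sum(sum(row) for row in table)
    k = len(table)
    # Observed agreement: proportion of cases on the diagonal
    p_o = sum(table[i][i] for i in range(k)) / n
    # Chance agreement: product of the observers' marginal proportions
    row_marg = [sum(row) / n for row in table]
    col_marg = [sum(table[i][j] for i in range(k)) / n for j in range(k)]
    p_e = sum(r * c for r, c in zip(row_marg, col_marg))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical counts: rows = observer 1, columns = observer 2
table = [[40, 10],
         [20, 30]]
print(round(cohen_kappa(table), 2))  # 0.4 (raw agreement 0.70, chance 0.50)
```

Note how a raw agreement of 70% shrinks to kappa = 0.40 once chance agreement is subtracted, which mirrors the paper's finding that 67.8% total agreement corresponds to a kappa of only 0.33.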