Evaluation of medical web sites - Interobserver and intraobserver reliability of an evaluation tool

Citation
P. Fremont et al., Evaluation of medical web sites - Interobserver and intraobserver reliability of an evaluation tool, CAN FAM PHY, 47, 2001, pp. 2270-2278
Number of citations
7
Subject categories
General & Internal Medicine
Journal title
CANADIAN FAMILY PHYSICIAN
ISSN journal
0008-350X
Volume
47
Year of publication
2001
Pages
2270 - 2278
Database
ISI
SICI code
0008-350X(200111)47:<2270:EOMWS->2.0.ZU;2-2
Abstract
Objective: To develop and test the reliability of a tool for rating websites that provide information on evidence-based medicine.
Design: For each site, 60% of the score was given for content (eight criteria) and 40% for organization and presentation (nine criteria). Five of 10 randomly selected sites met the inclusion criteria and were used by three observers to test the accuracy of the tool. Each site was rated twice by each observer, with a 3-week interval between ratings.
Setting: Laval University, Quebec City.
Participants: Three observers.
Main outcome measures: The intraclass correlation coefficient (ICC) was used to rate the reliability of the tool.
Results: Average overall scores for the five sites were 40%, 79%, 83%, 88%, and 89%. All three observers ranked the same two sites in fourth and fifth place and gave the top three ratings to the other three sites. The overall rating of the five sites by the three observers yielded an ICC of 0.93 to 0.97. An ICC of 0.87 was obtained for the two overall ratings conducted 3 weeks apart.
Conclusion: This new tool offers excellent intraobserver and interobserver measurement reliability and is an excellent means of distinguishing between medical websites of varying quality. For best results, we recommend that the tool be used simultaneously by two observers and that differences be resolved by consensus.
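Note: The abstract does not state which ICC form the authors used, so the following is only a minimal illustrative sketch. It assumes a two-way random-effects, absolute-agreement, single-rater model (ICC(2,1) in the Shrout-Fleiss nomenclature) and uses hypothetical per-observer scores placed near the reported site averages; the data matrix, function name, and ICC variant are assumptions, not the study's actual data or method.

```python
import numpy as np

def icc2_1(ratings: np.ndarray) -> float:
    """ICC(2,1): two-way random-effects, absolute-agreement, single rater.

    ratings: n_targets x n_raters matrix (here, sites x observers).
    """
    n, k = ratings.shape
    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-site means
    col_means = ratings.mean(axis=0)   # per-observer means

    # Sums of squares for a two-way ANOVA without replication.
    ss_rows = k * ((row_means - grand_mean) ** 2).sum()
    ss_cols = n * ((col_means - grand_mean) ** 2).sum()
    ss_total = ((ratings - grand_mean) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))

    # Shrout & Fleiss ICC(2,1) formula.
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Hypothetical overall scores (%) for 5 sites rated by 3 observers,
# chosen only to sit near the averages reported in the abstract.
scores = np.array([
    [40, 42, 38],
    [79, 81, 77],
    [83, 85, 82],
    [88, 87, 90],
    [89, 91, 88],
], dtype=float)

print(f"ICC(2,1) = {icc2_1(scores):.2f}")
```

With rater agreement of roughly this kind, the coefficient lands in the high range reported in the abstract (0.87 to 0.97); a single-rater, absolute-agreement form is shown because the tool is meant to compare raw site scores across observers, but other ICC variants could equally have been used by the authors.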