PERFORMANCES OF 27 MEDLINE SYSTEMS TESTED BY SEARCHES WITH CLINICAL QUESTIONS

Citation
R.B. Haynes et al., PERFORMANCES OF 27 MEDLINE SYSTEMS TESTED BY SEARCHES WITH CLINICAL QUESTIONS, Journal of the American Medical Informatics Association, 1(3), 1994, pp. 285-295
Number of citations
12
Subject categories
"Information Science & Library Science", "Medicine, Miscellaneous", "Computer Science, Information Systems"
ISSN journal
1067-5027
Volume
1
Issue
3
Year of publication
1994
Pages
285 - 295
Database
ISI
SICI code
1067-5027(1994)1:3<285:PO2MST>2.0.ZU;2-N
Abstract
Objective: To compare the performances of online and compact-disc (CD-ROM) versions of the National Library of Medicine's (NLM) MEDLINE database. Design: Analytic survey. Setting: Health Information Research Unit, McMaster University, Hamilton, Ontario, Canada. Intervention: Clinical questions were drawn from 18 searches originally conducted spontaneously by clinicians from wards and clinics who had used Grateful Med Version 4.0. Clinicians' search strategies were translated to meet the specific requirements of 13 online and 14 CD-ROM MEDLINE systems. A senior librarian and vendors' representatives constructed independent searches from the clinicians' questions. The librarian and clinician searches were run through each system, in command mode for the librarian and menu mode for clinicians, when available. Vendor searches were run through the vendors' own systems only. Main Measurements: Numbers of relevant and irrelevant citations retrieved, cost (for online systems only), and time. Results: Systems varied substantially for all searches, and for librarian and clinician searches separately, with respect to the numbers of relevant and irrelevant citations retrieved (p < 0.001 for both) and the cost per relevant citation (p = 0.012), but not with respect to the time per search. Based on combined rankings for the highest number of relevant and the lowest number of irrelevant citations retrieved, the SilverPlatter CD-ROM MEDLINE clinical journal subset performed best for librarian searches, while the PaperChase online system worked best for clinician searches. For cost per relevant citation retrieved, Dialog's Knowledge Index performed best for both librarian and clinician searches. Conclusions: There were substantial differences in the performances of competing MEDLINE systems, and performance was affected by whether the search strategy was conceived by a librarian or by clinicians.