Twenty-five strategies for improving the design, implementation and analysis of health services research related to alcohol and other drug abuse treatment

Citation
M.L. Dennis et al., Twenty-five strategies for improving the design, implementation and analysis of health services research related to alcohol and other drug abuse treatment, ADDICTION, 95, 2000, pp. S281-S308
Number of citations
118
Subject categories
Public Health & Health Care Science; Clinical Psychology & Psychiatry
Journal title
ADDICTION
ISSN journal
0965-2140
Volume
95
Year of publication
2000
Supplement
3
Pages
S281 - S308
Database
ISI
SICI code
0965-2140(200011)95:<S281:TSFITD>2.0.ZU;2-4
Abstract
While some aspects of addiction can be studied in laboratory or controlled settings, the study of long-term recovery management and the health services that support it requires going out into the community and dealing with populations and systems that are much more diverse and less under our control. This in turn raises many methodological challenges for the health services researchers studying alcohol and other drug abuse treatment. This paper identifies some of these challenges related to the design, measurement, implementation and effectiveness of health services research. It then recommends 25 strategies (and key primers) for addressing them: (1) identifying in advance all stakeholders and issues; (2) developing conceptual models of intervention and context; (3) identifying the population to whom the conclusions will be generalized; (4) matching the research design to the question; (5) conducting randomized experiments only when appropriate and necessary; (6) balancing methodological and treatment concerns; (7) prioritizing analysis plans and increasing design sensitivity; (8) combining qualitative and quantitative methods; (9) identifying the four basic types of measures needed; (10) identifying and using standardized measures; (11) carefully balancing measurement selection and modification; (12) developing and evaluating modified and new measures when necessary; (13) identifying and tracking major clinical subgroups; (14) measuring and analyzing the actual pattern of services received; (15) incorporating implementation checks into the design; (16) incorporating baseline measures into the intervention; (17) monitoring implementation and dosage as a form of quality assurance; (18) developing procedures early to facilitate tracking and follow-up of study participants; (19) using more appropriate representations of the actual experiment; (20) using appropriate and sensitive standard deviation terms; (21) partialing out variance due to design or known sources prior to estimating experimental effect sizes; (22) using dimensional, interval and ratio measures to increase sensitivity to change; (23) using path or structural equation models; (24) integrating qualitative and quantitative analysis into reporting; and (25) using quasi-experiments, economic or organizational studies to answer other likely policy questions. Most of these strategies have been tried and tested in this and other areas, but are not widely used. Improving the state of the art of health services research and bridging the gap between research and practice do not depend upon using the most advanced methods, but rather upon using the most appropriate methods.
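
The analytic strategies (20) and (21) in the abstract can be illustrated with a minimal sketch. The example below is not taken from the paper: the simulated data, the variable names (baseline, group, outcome) and the simple residual-based adjustment are assumptions, used only to show how partialing out variance attributable to a known covariate shrinks the standard deviation term and thereby yields a more sensitive effect-size estimate.

# Minimal sketch (assumed example, not from the paper) of strategies 20-21:
# choose an appropriate standard deviation term and partial out variance
# due to a known covariate before estimating an experimental effect size.
import numpy as np

rng = np.random.default_rng(0)
n = 200
baseline = rng.normal(50, 10, n)        # hypothetical baseline severity score
group = rng.integers(0, 2, n)           # 0 = usual care, 1 = enhanced services
outcome = 0.6 * baseline - 4.0 * group + rng.normal(0, 8, n)

def pooled_sd(x, g):
    """Pooled standard deviation across the two groups coded 0/1 in g."""
    x1, x0 = x[g == 1], x[g == 0]
    return np.sqrt(((len(x1) - 1) * x1.var(ddof=1) + (len(x0) - 1) * x0.var(ddof=1))
                   / (len(x1) + len(x0) - 2))

# Unadjusted effect size: raw mean difference over the pooled outcome SD.
d_raw = (outcome[group == 1].mean() - outcome[group == 0].mean()) / pooled_sd(outcome, group)

# Covariate-adjusted effect size: regress the outcome on the baseline covariate,
# then compute the same statistic on the residuals, so variance explained by a
# known source no longer inflates the standard deviation in the denominator.
slope, intercept = np.polyfit(baseline, outcome, 1)
resid = outcome - (intercept + slope * baseline)
d_adj = (resid[group == 1].mean() - resid[group == 0].mean()) / pooled_sd(resid, group)

print(f"unadjusted d = {d_raw:.2f}  covariate-adjusted d = {d_adj:.2f}")

Because the baseline covariate explains much of the outcome variance in this simulated example, the adjusted estimate is larger in magnitude than the unadjusted one, which is the gain in design sensitivity the abstract refers to.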