Markov Models in Medical Decision Making: A Practical Guide

Citation
F. A. Sonnenberg and J. R. Beck, Markov Models in Medical Decision Making: A Practical Guide, Medical Decision Making, 13(4), 1993, pp. 322-338
Citations number
20
Subject categories
Medicine, Miscellaneous
Journal title
Medical Decision Making
ISSN journal
0272989X
Volume
13
Issue
4
Year of publication
1993
Pages
322 - 338
Database
ISI
SICI code
0272-989X(1993)13:4<322:MIMD-A>2.0.ZU;2-E
Abstract
Markov models are useful when a decision problem involves risk that is continuous over time, when the timing of events is important, and when important events may happen more than once. Representing such clinical settings with conventional decision trees is difficult and may require unrealistic simplifying assumptions. Markov models assume that a patient is always in one of a finite number of discrete health states, called Markov states. All events are represented as transitions from one state to another. A Markov model may be evaluated by matrix algebra, as a cohort simulation, or as a Monte Carlo simulation. A newer representation of Markov models, the Markov-cycle tree, uses a tree representation of clinical events and may be evaluated either as a cohort simulation or as a Monte Carlo simulation. The ability of the Markov model to represent repetitive events and the time dependence of both probabilities and utilities allows for more accurate representation of clinical settings that involve these issues.
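
As a rough illustration of the cohort-simulation approach mentioned in the abstract, the Python sketch below tracks the fraction of a hypothetical cohort in each of three assumed health states (well, sick, dead) over successive cycles, crediting a per-cycle utility for state membership. The states, transition probabilities, utilities, and cycle count are illustrative assumptions, not values taken from the article.

```python
# Minimal Markov cohort simulation sketch (illustrative values only).
# Patients occupy one of a finite set of health states; each cycle the
# cohort is redistributed according to fixed transition probabilities.
import numpy as np

states = ["WELL", "SICK", "DEAD"]

# Row i, column j: probability of moving from state i to state j in one cycle.
# DEAD is an absorbing state (its row keeps all probability on itself).
P = np.array([
    [0.90, 0.07, 0.03],
    [0.00, 0.80, 0.20],
    [0.00, 0.00, 1.00],
])

# Quality-of-life weight credited for one cycle spent in each state.
utility = np.array([1.0, 0.6, 0.0])

cohort = np.array([1.0, 0.0, 0.0])  # the entire cohort starts in WELL
total_quality_adjusted_cycles = 0.0

for cycle in range(50):
    total_quality_adjusted_cycles += cohort @ utility  # reward for this cycle
    cohort = cohort @ P                                # redistribute the cohort

print(f"Expected quality-adjusted cycles per patient: "
      f"{total_quality_adjusted_cycles:.2f}")
```

The same state and transition structure could instead be evaluated by matrix algebra or sampled patient by patient as a Monte Carlo simulation, the two other evaluation methods the abstract lists.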