Two major reasons for the popularity of the EM algorithm are that its maximization step involves only complete-data maximum likelihood estimation, which is often computationally simple, and that its convergence is stable, with each iteration increasing the likelihood. When the associated complete-data maximum likelihood estimation is itself complicated, EM is less attractive because the M-step is correspondingly difficult. In many cases, however, complete-data maximum likelihood estimation is relatively simple when conditional on some function of the parameters being estimated. We introduce a class of generalized EM algorithms, which we call the ECM algorithm, for Expectation/Conditional Maximization (CM), that takes advantage of the simplicity of complete-data conditional maximum likelihood estimation by replacing a complicated M-step of EM with several computationally simpler CM-steps. We show that the ECM algorithm shares all the appealing convergence properties of EM, such as always increasing the likelihood, and present several illustrative examples.
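The loop structure described above, an E-step followed by a sequence of CM-steps that each maximize over one parameter block with the others held fixed, can be sketched on a toy problem: fitting a normal mean and variance when some observations are missing completely at random. All names here are our own, and for this simple model the joint M-step is actually available in closed form, so splitting it into two CM-steps is purely illustrative of the ECM structure.

```python
def ecm_normal(y, n_miss, iters=200):
    """ECM-style fit of N(mu, sigma2) to data with n_miss values
    missing completely at random.  Toy sketch: the single M-step of
    EM is replaced by two conditional maximizations (CM-steps)."""
    n = len(y) + n_miss
    mu, sigma2 = sum(y) / len(y), 1.0  # crude starting values
    for _ in range(iters):
        # E-step: expected complete-data sufficient statistics,
        # filling in E[y_i] = mu and E[y_i^2] = mu^2 + sigma2
        # for each missing observation.
        s1 = sum(y) + n_miss * mu
        s2 = sum(v * v for v in y) + n_miss * (mu * mu + sigma2)
        # CM-step 1: maximize over mu with sigma2 held fixed.
        mu = s1 / n
        # CM-step 2: maximize over sigma2 with the new mu held fixed.
        sigma2 = s2 / n - 2 * mu * s1 / n + mu * mu
    return mu, sigma2
```

Each iteration cannot decrease the observed-data likelihood, mirroring the convergence property claimed for ECM; here the iterates settle at the observed-data maximum likelihood estimates, since the missing values carry no information about the parameters.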