Computing posterior modes (e.g., maximum likelihood estimates) for models involving latent variables or missing data often requires complicated optimization procedures. By splitting this task into two simpler parts, however, EM-type algorithms often offer a simple solution. Although this approach has proven useful, in some settings even these simpler tasks are challenging. In particular, computations involving latent variables are typically difficult to simplify. Thus, in models such as hierarchical models with complicated latent variable structures, computationally intensive methods may be required for the expectation step of EM. This paper describes how nesting two or more EM algorithms can take advantage of closed-form conditional expectations and lead to algorithms which converge faster, are straightforward to implement, and enjoy stable convergence properties. Methodology to monitor convergence of nested EM algorithms is developed using importance and bridge sampling. The strategy is applied to hierarchical probit and t regression models to derive algorithms that incorporate aspects of Monte Carlo EM, PX-EM, and nesting in order to combine computational efficiency with easy implementation.
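To illustrate the two-part split the abstract refers to, the following is a minimal sketch of a plain (non-nested) EM algorithm for a toy model: a two-component Gaussian mixture with unit variances and equal weights, where the unobserved component labels play the role of the latent variables. The function name `em_two_normals` and the model choice are illustrative assumptions, not the paper's hierarchical models; the E-step here has a closed form, which is exactly the property the paper exploits when nesting EM algorithms.

```python
import math
import random

def em_two_normals(x, mu1, mu2, n_iter=50):
    """Toy EM: estimate the two component means of a mixture
    0.5*N(mu1, 1) + 0.5*N(mu2, 1) from data x.

    The latent variable for each x_i is its component label."""
    for _ in range(n_iter):
        # E-step: posterior probability r_i that x_i came from component 1,
        # available in closed form for this simple model.
        r = []
        for xi in x:
            d1 = math.exp(-0.5 * (xi - mu1) ** 2)
            d2 = math.exp(-0.5 * (xi - mu2) ** 2)
            r.append(d1 / (d1 + d2))
        # M-step: responsibility-weighted means maximize the expected
        # complete-data log-likelihood.
        mu1 = sum(ri * xi for ri, xi in zip(r, x)) / sum(r)
        mu2 = sum((1 - ri) * xi for ri, xi in zip(r, x)) / (len(x) - sum(r))
    return mu1, mu2

# Example usage (simulated data; means chosen arbitrarily for illustration):
random.seed(0)
data = ([random.gauss(-2.0, 1.0) for _ in range(300)]
        + [random.gauss(3.0, 1.0) for _ in range(300)])
m1, m2 = em_two_normals(data, -1.0, 1.0)
```

When the E-step lacks such a closed form (as in the hierarchical models the paper considers), it is this expectation computation that becomes expensive and motivates Monte Carlo or nested EM strategies.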