It is well known that the convergence rate of the expectation-maximization (EM) algorithm can be faster than that of conventional first-order iterative algorithms when the overlap in the given mixture is small. However, this argument has not been mathematically proved yet. This article studies the problem asymptotically in the setting of Gaussian mixtures under the theoretical framework of Xu and Jordan (1996). It is proved that the asymptotic convergence rate of the EM algorithm for Gaussian mixtures locally around the true solution $\Theta^*$ is $o(e^{0.5-\epsilon}(\Theta^*))$, where $\epsilon > 0$ is an arbitrarily small number, $o(x)$ denotes a quantity that is a higher-order infinitesimal as $x \to 0$, and $e(\Theta^*)$ is a measure of the average overlap of the Gaussians in the mixture. In other words, the large-sample local convergence rate of the EM algorithm tends to be asymptotically superlinear as $e(\Theta^*)$ tends to zero.
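
As an illustration of the claimed behavior (not the paper's analysis), the sketch below runs plain EM on a simulated one-dimensional two-component Gaussian mixture and estimates the local linear convergence factor empirically, for several separations between the component means. The separation `delta` is only a stand-in for small overlap; the overlap measure $e(\Theta^*)$ of Xu and Jordan (1996) is not computed, and the function name `em_gaussian_mixture_1d` and all settings are illustrative assumptions.

```python
# Illustrative sketch: estimate the local convergence factor of EM,
# r_t = ||Theta^(t+1) - Theta_hat|| / ||Theta^(t) - Theta_hat||,
# where Theta_hat is taken to be the EM fixed point reached after many iterations.
# Larger separation between component means (smaller overlap) should yield
# factors much closer to zero, i.e. faster (nearly superlinear) convergence.
import numpy as np

def em_gaussian_mixture_1d(x, means, variances, weights, n_iter):
    """Plain EM updates for a 1-D Gaussian mixture; returns the parameter history."""
    history = []
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        dens = np.exp(-0.5 * (x[:, None] - means[None, :]) ** 2 / variances[None, :])
        dens /= np.sqrt(2.0 * np.pi * variances[None, :])
        resp = weights[None, :] * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixing weights, means, and variances
        nk = resp.sum(axis=0)
        weights = nk / len(x)
        means = (resp * x[:, None]).sum(axis=0) / nk
        variances = (resp * (x[:, None] - means[None, :]) ** 2).sum(axis=0) / nk
        history.append(np.concatenate([weights, means, variances]))
    return np.array(history)

rng = np.random.default_rng(0)
n = 20000
for delta in (6.0, 2.0, 1.0):  # larger delta => smaller overlap
    # sample from an equal-weight mixture of N(-delta/2, 1) and N(+delta/2, 1)
    z = rng.integers(0, 2, size=n)
    x = rng.normal(loc=(z - 0.5) * delta, scale=1.0)
    hist = em_gaussian_mixture_1d(
        x,
        means=np.array([-0.5 * delta - 0.3, 0.5 * delta + 0.3]),
        variances=np.array([1.2, 1.2]),
        weights=np.array([0.5, 0.5]),
        n_iter=200,
    )
    theta_hat = hist[-1]  # treat the last iterate as the fixed point
    err = np.linalg.norm(hist - theta_hat, axis=1)
    # use only iterations where the error is still well above machine precision
    k = int(np.argmax(err < 1e-8)) if (err < 1e-8).any() else len(err)
    k = min(k, 10)
    rates = err[1:k] / err[:k - 1]
    print(f"delta = {delta:4.1f}, observed convergence factors: {np.round(rates, 4)}")
```

Under these assumed settings, the printed factors for the well-separated case should be markedly smaller than those for the heavily overlapping case, which is the qualitative pattern the theorem quantifies.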