This research focuses on a general class of maximum likelihood problems in which it is desired to maximise a nonparametric mixture likelihood with finitely many known component densities over the set of unknown weight parameters. Convergence of the conventional EM algorithm for this problem is extremely slow when the component densities are poorly separated and when the maximum likelihood estimator requires some of the weights to be zero, as the algorithm can never reach such a boundary point. Alternative methods based on the principles of EM are developed using a two-stage approach. First, a new data augmentation scheme provides improved convergence rates in certain parameter directions. Secondly, two 'cyclic versions' of this data augmentation are created by changing the missing data formulation between the EM steps; these extend the acceleration directions to the whole parameter space, giving another order-of-magnitude increase in convergence rate. Examples indicate that the new cyclic versions of the data augmentation schemes can converge up to 500 times faster than the conventional EM algorithm for fitting nonparametric finite mixture models.
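
For concreteness, the sketch below illustrates the conventional (unaccelerated) EM iteration for this weight-estimation problem, which serves as the baseline in the comparisons above: with the component densities known, the E-step computes posterior responsibilities and the M-step sets each weight to the average responsibility. This is a minimal illustration, not the accelerated schemes developed in this work; the function name em_mixture_weights and the two-component normal usage example are assumptions introduced here for exposition.

```python
import numpy as np

def em_mixture_weights(densities, n_iter=10000, tol=1e-10):
    """Conventional EM for mixture weights with known component densities.

    densities : (n, m) array, densities[i, j] = f_j(x_i), the j-th known
                component density evaluated at the i-th observation.
    Returns the estimated weight vector of length m.
    """
    n, m = densities.shape
    w = np.full(m, 1.0 / m)                      # start from uniform weights
    for _ in range(n_iter):
        # E-step: posterior probability that observation i came from component j
        num = densities * w                       # broadcasts w_j * f_j(x_i)
        resp = num / num.sum(axis=1, keepdims=True)
        # M-step: each new weight is the average responsibility of its component
        w_new = resp.mean(axis=0)
        if np.max(np.abs(w_new - w)) < tol:      # stop when updates stall
            return w_new
        w = w_new
    return w

# Hypothetical usage: two normal components with known means 0 and 3
if __name__ == "__main__":
    from scipy.stats import norm
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(0, 1, 300), rng.normal(3, 1, 700)])
    dens = np.column_stack([norm.pdf(x, 0, 1), norm.pdf(x, 3, 1)])
    print(em_mixture_weights(dens))              # roughly [0.3, 0.7]
```

When a true weight is zero, updates of this multiplicative form shrink that weight only geometrically and never reach the boundary exactly, which is the slow-convergence behaviour that motivates the accelerated schemes described above.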