Finding a good fit of a combination of Gaussians to arbitrary empirical data is difficult. The surface defined by the objective function contains many local minima, which trap gradient-descent algorithms and cause stochastic methods to linger near them for unreasonably long times. A number of techniques for accelerating convergence when using simulated annealing are presented. These are tested on a sample of known Gaussian combinations and compared for accuracy and resource consumption. A single 'best' set of techniques is found which gives good results on the test samples and on empirical data.
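To make the setting concrete, the following is a minimal sketch of a simulated-annealing fit of a sum of Gaussians to noisy samples. It is not the method evaluated in the paper: the least-squares objective, the geometric cooling schedule, the single-parameter perturbation proposal, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_mixture(x, params):
    """Sum of Gaussians; params is a flat array of (weight, mean, std) triples."""
    y = np.zeros_like(x)
    for w, mu, sigma in params.reshape(-1, 3):
        s = abs(sigma) + 1e-12          # guard against a zero width during the random walk
        y += w * np.exp(-0.5 * ((x - mu) / s) ** 2)
    return y

def objective(params, x, y):
    """Sum-of-squares misfit between the model and the data (an assumed choice)."""
    return np.sum((gaussian_mixture(x, params) - y) ** 2)

def anneal(x, y, n_components=2, steps=20000, t0=1.0, cooling=0.9995):
    """Basic simulated annealing with a geometric cooling schedule."""
    params = rng.uniform(0.1, 2.0, size=3 * n_components)
    energy = objective(params, x, y)
    best, best_energy = params.copy(), energy
    t = t0
    for _ in range(steps):
        # Propose a small random perturbation of one parameter.
        trial = params.copy()
        i = rng.integers(trial.size)
        trial[i] += rng.normal(scale=0.1)
        trial_energy = objective(trial, x, y)
        # Always accept downhill moves; accept uphill moves with Boltzmann probability.
        if trial_energy < energy or rng.random() < np.exp(-(trial_energy - energy) / t):
            params, energy = trial, trial_energy
            if energy < best_energy:
                best, best_energy = params.copy(), energy
        t *= cooling                    # cool the temperature
    return best, best_energy

# Example: recover a known two-Gaussian combination from noisy samples.
x = np.linspace(-5, 5, 200)
true_params = np.array([1.0, -1.5, 0.7, 0.6, 2.0, 1.2])
y = gaussian_mixture(x, true_params) + rng.normal(scale=0.02, size=x.size)
fit, misfit = anneal(x, y)
print("best misfit:", misfit)
```

Even this plain scheme illustrates the problem the abstract raises: on a multi-minimum surface the walk can spend most of its budget near a poor local minimum, which is what the acceleration techniques studied here are meant to mitigate.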