The least mean square (LMS) algorithm is known to converge in the mean and in the mean square. However, during short time periods, the error sequence can blow up and cause severe disturbances, especially for non-Gaussian processes. This contribution discusses potential short-time unstable behavior of the LMS algorithm for spherically invariant random processes (SIRP) such as Gaussian, Laplacian, and the like. The result of this investigation is that the probability of bursting decreases with the step size. However, since a smaller step size also causes a slower convergence rate, one has to trade off convergence speed against the frequency of bursting.
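The tradeoff refers to the standard LMS recursion w(n+1) = w(n) + mu * e(n) * u(n) with a priori error e(n) = d(n) - w(n)^T u(n). The following Python sketch (not from the paper; the filter length, step sizes, and Laplacian test signal are illustrative assumptions) identifies an unknown filter with two step sizes and reports the peak and RMS error over the second half of the run: the larger mu converges faster but shows larger short-time error peaks, i.e., more frequent bursting, for the non-Gaussian Laplacian input.

    import numpy as np

    rng = np.random.default_rng(0)

    N = 8                                   # filter length (assumed)
    w_true = rng.standard_normal(N)         # unknown system to identify (assumed)

    # Laplacian input: one of the non-Gaussian SIRP examples named above.
    x = rng.laplace(size=20000)
    d = np.convolve(x, w_true)[: len(x)] + 0.01 * rng.standard_normal(len(x))

    def lms(x, d, mu, N):
        """Standard LMS: e(n) = d(n) - w^T u(n); w <- w + mu * e(n) * u(n)."""
        w = np.zeros(N)
        e = np.zeros(len(x))
        for n in range(N - 1, len(x)):
            u = x[n - N + 1 : n + 1][::-1]  # input vector, u[k] = x[n - k]
            e[n] = d[n] - w @ u             # a priori error
            w += mu * e[n] * u              # LMS coefficient update
        return e

    # Small vs. large step size (illustrative values, not from the paper).
    for mu in (0.002, 0.02):
        e = lms(x, d, mu, N)
        tail = e[len(e) // 2 :]             # error after initial convergence
        # Large max|e| relative to rms(e) indicates short-time bursts.
        print(f"mu={mu}: max|e| = {np.abs(tail).max():.3f}, "
              f"rms(e) = {np.sqrt(np.mean(tail**2)):.4f}")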