Sharp convergence rates are given for stochastic approximation algorithms in the case where the derivative of the unknown regression function vanishes at the sought-for root. The rates obtained are sharp for the general step sizes used in the algorithms, in contrast to previous work, where they are not sharp for slowly decreasing step sizes; all possible limit points are found in the case where the first matrix coefficient in the expansion of the regression function is normal; and the estimation upper bound is shown to be attained in the multi-dimensional case, in contrast to previous work, which proves only the one-dimensional result.
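To make the setting concrete, the following is a minimal sketch, not the paper's algorithm: a generic Robbins-Monro iteration with polynomially decreasing step sizes, applied to a regression function whose derivative vanishes at the root (here f(x) = x^3, root 0, f'(0) = 0). The step-size exponent gamma, the noise level, and all function names are illustrative assumptions.

```python
import random

def robbins_monro(f_noisy, x0, a=1.0, gamma=1.0, n_iters=10000):
    """Generic Robbins-Monro sketch: x_{n+1} = x_n - a_n * Y_n,
    where Y_n is a noisy observation of f(x_n) and a_n = a / n**gamma
    is a polynomially decreasing step size (gamma is an assumption here)."""
    x = x0
    for n in range(1, n_iters + 1):
        step = a / n**gamma  # slowly decreasing when gamma is small
        x -= step * f_noisy(x)
    return x

random.seed(0)
# Degenerate case discussed in the abstract: f(x) = x**3 has root 0
# with f'(0) = 0, so convergence is slower than in the regular case.
root = robbins_monro(lambda x: x**3 + 0.1 * random.gauss(0, 1), x0=1.0)
```

In the regular case (f'(root) nonzero) the iterates approach the root at the usual rate; with a vanishing derivative the attraction near the root is weak, which is why the sharp rates and limit points studied in the paper require a separate analysis.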