N.J. Bershad et al., "Stochastic convergence analysis of the single-layer backpropagation algorithm for noisy input data," IEEE Transactions on Signal Processing, 44(5), 1996, pp. 1315-1319
The statistical learning behavior of the single-layer backpropagation
algorithm was recently analyzed for a system identification formulation
with noise-free training data. Transient and steady-state results were
obtained for the mean weight behavior, mean-square error (MSE), and
probability of correct classification. This correspondence extends these
results to the case of noisy training data. Three new analytical results
are obtained: 1) the mean weights converge to finite values, 2) the MSE
is bounded away from zero, and 3) the probability of correct
classification does not converge to unity. However, over a wide range of
signal-to-noise ratio (SNR), the noisy training data does not have a
significant effect on the perceptron stationary points relative to the
weight fluctuations. Hence, one concludes that noisy training data has a
relatively small effect on the ability of the perceptron to learn the
underlying weight vector F of the training-signal model.
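The setting described in the abstract can be illustrated with a small numerical sketch. This is not the authors' analysis, only a hedged toy simulation under assumed parameters: a single sigmoid unit (the "single-layer backpropagation" perceptron) is trained with the delta rule on Gaussian inputs, where the desired label comes from a clean linear model with weight vector F, but the inputs seen by the learner are corrupted by additive noise at an assumed SNR. The variable names (`F`, `snr_db`, `lr`) and the specific SNR scaling are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training-signal model: labels generated by a clean
# linear threshold on the true weight vector F (assumed values).
F = np.array([1.0, -0.5])
n_samples = 20000          # training iterations (assumed)
lr = 0.05                  # learning rate (assumed)
snr_db = 10.0              # input SNR in dB (assumed)
noise_std = np.linalg.norm(F) * 10 ** (-snr_db / 20)

def train(noise_std):
    """Single-layer backpropagation (delta rule on one sigmoid unit).

    The desired response is computed from the *clean* input, but the
    weight update only ever sees the *noisy* input, matching the
    noisy-training-data scenario of the abstract.
    """
    w = np.zeros(2)
    for _ in range(n_samples):
        x = rng.standard_normal(2)                 # clean input
        d = 1.0 if F @ x > 0 else 0.0              # desired label
        x_noisy = x + noise_std * rng.standard_normal(2)
        y = 1.0 / (1.0 + np.exp(-(w @ x_noisy)))   # sigmoid output
        e = d - y
        # Gradient-descent (backprop) update through the sigmoid:
        w += lr * e * y * (1.0 - y) * x_noisy
    return w

w_noisy = train(noise_std)

# The learned weight direction can be compared with F; at moderate SNR
# it should remain close to F even though training data were noisy.
cos = (w_noisy @ F) / (np.linalg.norm(w_noisy) * np.linalg.norm(F))
print("cosine(w, F) =", cos)
```

Consistent with the abstract's conclusion, at a moderate assumed SNR the weight estimate stays well aligned with F, while the added input noise keeps the weights at finite values and the MSE bounded away from zero.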