STOCHASTIC CONVERGENCE ANALYSIS OF THE SINGLE-LAYER BACKPROPAGATION ALGORITHM FOR NOISY INPUT DATA

Citation
N.J. Bershad et al., STOCHASTIC CONVERGENCE ANALYSIS OF THE SINGLE-LAYER BACKPROPAGATION ALGORITHM FOR NOISY INPUT DATA, IEEE Transactions on Signal Processing, 44(5), 1996, pp. 1315-1319
Citations number
4
Subject Categories
Engineering, Electrical & Electronic
ISSN journal
1053-587X
Volume
44
Issue
5
Year of publication
1996
Pages
1315 - 1319
Database
ISI
SICI code
1053-587X(1996)44:5<1315:SCAOTS>2.0.ZU;2-2
Abstract
The statistical learning behavior of the single-layer backpropagation algorithm was recently analyzed for a system identification formulation with noise-free training data. Transient and steady-state results were obtained for the mean weight behavior, mean-square error (MSE), and probability of correct classification. This correspondence extends these results to the case of noisy training data. Three new analytical results are obtained: 1) the mean weights converge to finite values, 2) the MSE is bounded away from zero, and 3) the probability of correct classification does not converge to unity. However, over a wide range of signal-to-noise ratio (SNR), the noisy training data does not have a significant effect on the perceptron stationary points relative to the weight fluctuations. Hence, one concludes that noisy training data has a relatively small effect on the ability of the perceptron to learn the underlying weight vector F of the training signal model.
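The setting described in the abstract can be illustrated with a minimal simulation sketch. The code below is an assumption-laden illustration, not the paper's analysis: it uses a tanh nonlinearity, an arbitrary step size, and an arbitrary noise model (additive white Gaussian on the inputs at a chosen SNR). Labels come from a system identification model sign(F·x) with clean inputs, while the perceptron trains only on the noise-corrupted inputs, so the steady-state MSE stays bounded away from zero even as the weights settle near the direction of F.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a true weight vector F generates desired outputs
# from clean inputs; the perceptron only ever sees noisy inputs.
n, dim, snr = 20000, 8, 10.0          # samples, input dimension, linear SNR
F = rng.standard_normal(dim)
F /= np.linalg.norm(F)                 # unit-norm underlying weight vector

X = rng.standard_normal((n, dim))      # clean training inputs
labels = np.sign(X @ F)                # desired outputs of the signal model
noise = rng.standard_normal((n, dim)) * np.sqrt(1.0 / snr)
X_noisy = X + noise                    # inputs actually seen during training

w = np.zeros(dim)
mu = 0.05                              # step size (assumed, not from the paper)
for x, d in zip(X_noisy, labels):
    y = np.tanh(w @ x)                 # sigmoid-type perceptron output
    e = d - y
    w += mu * e * (1.0 - y**2) * x     # single-layer backpropagation update

# The learned weights align with F, but input noise leaves a residual
# MSE floor: the error does not converge to zero.
cos_align = (w @ F) / np.linalg.norm(w)
mse = np.mean((labels - np.tanh(X_noisy @ w)) ** 2)
```

Raising `snr` in this sketch shrinks the residual `mse`, mirroring the paper's conclusion that over a wide SNR range the noise perturbs the stationary points only mildly relative to the weight fluctuations.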