CONSISTENCY OF MULTILAYER PERCEPTRON REGRESSION-ESTIMATORS

Citation
J. Mielniczuk and J. Tyrcha, CONSISTENCY OF MULTILAYER PERCEPTRON REGRESSION-ESTIMATORS, Neural Networks, 6(7), 1993, pp. 1019-1022
Citations number
13
Subject Categories
"Mathematical Methods, Biology & Medicine", "Computer Sciences, Special Topics", "Computer Applications & Cybernetics", "Neurosciences", "Physics, Applied"
Journal title
Neural Networks
ISSN journal
08936080
Volume
6
Issue
7
Year of publication
1993
Pages
1019 - 1022
Database
ISI
SICI code
0893-6080(1993)6:7<1019:COMPR>2.0.ZU;2-U
Abstract
In the paper a three-layer perceptron with one hidden layer and an output layer consisting of a single neuron is considered. This is a commonly used architecture for solving regression problems, in which one seeks such a perceptron minimizing the mean squared error criterion for the data points (x(k), y(k)), k = 1, ..., N. It is shown that in the model y(k) = g0(x(k)) + epsilon(k), k = 1, ..., N, where x(k) is independent of the zero-mean error term epsilon(k), this procedure is consistent as N --> infinity, provided that g0 is representable as a three-layer perceptron with Heaviside transfer function. The same result holds when the transfer function is an arbitrary continuous function with bounded limits at +/- infinity and the hidden-to-output weights in the considered family of perceptrons are bounded.
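The setting described in the abstract can be illustrated with a minimal sketch (not the paper's procedure, and with hypothetical choices of hidden-layer size, learning rate, and noise level): a one-hidden-layer perceptron with a single output neuron, trained by gradient descent on the empirical mean squared error for data generated as y(k) = g0(x(k)) + epsilon(k). A sigmoid stands in for the Heaviside transfer function, since it is a continuous function with bounded limits at +/- infinity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    # Bounded continuous transfer function (limits 0 and 1 at -/+ infinity),
    # a smooth stand-in for the Heaviside function mentioned in the abstract.
    return 1.0 / (1.0 + np.exp(-z))

def fit_mlp(x, y, hidden=5, lr=0.2, epochs=3000):
    """Gradient descent on the empirical MSE (1/N) sum_k (y(k) - f(x(k)))^2."""
    n = x.shape[0]
    W = rng.normal(scale=0.5, size=(1, hidden))   # input-to-hidden weights
    b = np.zeros(hidden)                          # hidden biases
    v = rng.normal(scale=0.5, size=hidden)        # hidden-to-output weights
    c = 0.0                                       # output bias
    for _ in range(epochs):
        h = sigmoid(x @ W + b)          # (n, hidden) hidden activations
        pred = h @ v + c                # (n,) single output neuron
        err = pred - y                  # residuals
        # Backpropagate the MSE gradient.
        grad_v = h.T @ err * (2.0 / n)
        grad_c = 2.0 * err.mean()
        dh = np.outer(err, v) * h * (1.0 - h) * (2.0 / n)
        grad_W = x.T @ dh
        grad_b = dh.sum(axis=0)
        v -= lr * grad_v
        c -= lr * grad_c
        W -= lr * grad_W
        b -= lr * grad_b
    return W, b, v, c

def predict(params, x):
    W, b, v, c = params
    return sigmoid(x @ W + b) @ v + c

# Model of the abstract: y(k) = g0(x(k)) + epsilon(k), with zero-mean noise
# independent of x(k). Here g0 is a hypothetical smooth target.
x = rng.uniform(-2.0, 2.0, size=(200, 1))
g0 = np.tanh(2.0 * x[:, 0])
y = g0 + rng.normal(scale=0.05, size=200)

params = fit_mlp(x, y)
mse = np.mean((predict(params, x) - y) ** 2)
```

The consistency result concerns the limit N --> infinity; this finite-sample sketch only shows the MSE-minimizing fit the theorem is about, with the hidden-to-output weights kept moderate by the small initialization, echoing the boundedness condition.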