FUNCTIONAL APPROXIMATION BY FEEDFORWARD NETWORKS - A LEAST-SQUARES APPROACH TO GENERALIZATION

Authors
Citation
A.R. Webb, FUNCTIONAL APPROXIMATION BY FEEDFORWARD NETWORKS - A LEAST-SQUARES APPROACH TO GENERALIZATION, IEEE Transactions on Neural Networks, 5(3), 1994, pp. 363-379
Citations number
19
Subject Categories
Computer Application, Chemistry & Engineering; Engineering, Electrical & Electronic; Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods
ISSN journal
1045-9227
Volume
5
Issue
3
Year of publication
1994
Pages
363 - 379
Database
ISI
SICI code
1045-9227(1994)5:3<363:FABFN->2.0.ZU;2-Z
Abstract
This paper considers a least-squares approach to function approximation and generalization. The particular problem addressed is one in which the training data are noiseless (perhaps specified by an assumed model or obtained during some calibration procedure) and the requirement is to define a mapping that approximates the data and that generalizes to situations in which data samples are corrupted by noise in the input variables. The least-squares approach produces a generalizer that has the form of a Radial Basis Function network for a finite number of training samples. The finite-sample approximation is valid provided that the perturbations due to noise on the expected operating conditions are large compared to the sample spacing in the data space. In the other extreme of small noise perturbations, a particular parametric form must be assumed for the generalizer. It is shown that better generalization will occur if the error criterion used in training the generalizer is modified by the addition of a specific regularization term. This is illustrated by an approximator that has a feed-forward architecture and is applied to the problem of point-source location using the outputs of an array of receivers in the focal plane of a lens.
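The large-noise result summarized in the abstract (a generalizer with the form of a Radial Basis Function network built from the noiseless training samples) can be pictured with a short numerical sketch. The code below is an illustrative reconstruction of that idea only, not the paper's derivation: the Gaussian kernel, the 1-D sine-wave calibration data, the kernel width tied to the assumed input-noise standard deviation, and all variable names are assumptions introduced here.

import numpy as np

rng = np.random.default_rng(0)

# Noiseless training data, e.g. from an assumed model or a calibration run
x_train = np.linspace(0.0, 2.0 * np.pi, 40)
y_train = np.sin(x_train)

sigma_noise = 0.3  # assumed std. dev. of input noise under operating conditions

def rbf_generalizer(x_query, x_train, y_train, width):
    """Normalised RBF estimate: kernel-weighted average of training targets."""
    # Squared distances between each query input and each training input
    d2 = (x_query[:, None] - x_train[None, :]) ** 2
    w = np.exp(-0.5 * d2 / width**2)       # Gaussian kernel weights
    w /= w.sum(axis=1, keepdims=True)      # normalise over training samples
    return w @ y_train

# Simulate operating conditions: true inputs corrupted by noise before measurement
x_clean = rng.uniform(0.0, 2.0 * np.pi, 500)
x_noisy = x_clean + rng.normal(0.0, sigma_noise, x_clean.shape)

y_pred = rbf_generalizer(x_noisy, x_train, y_train, width=sigma_noise)
mse = np.mean((y_pred - np.sin(x_clean)) ** 2)
print(f"mean-squared generalization error: {mse:.4f}")

This finite-sample construction is the regime in which the noise perturbations are large relative to the sample spacing. In the opposite, small-noise regime the abstract instead assumes a parametric feed-forward generalizer trained with a modified, regularized error criterion; the specific form of that regularization term is developed in the paper itself and is not reproduced here.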