ON RADIAL BASIS FUNCTION NETS AND KERNEL REGRESSION - STATISTICAL CONSISTENCY, CONVERGENCE RATES, AND RECEPTIVE-FIELD SIZE

Citation
L. Xu et al., ON RADIAL BASIS FUNCTION NETS AND KERNEL REGRESSION - STATISTICAL CONSISTENCY, CONVERGENCE RATES, AND RECEPTIVE-FIELD SIZE, Neural Networks, 7(4), 1994, pp. 609-628
Citations number
45
Subject Categories
Mathematical Methods, Biology & Medicine; Computer Sciences, Special Topics; Computer Science, Artificial Intelligence; Neurosciences; Physics, Applied
Journal title
Neural Networks
ISSN journal
0893-6080
Volume
7
Issue
4
Year of publication
1994
Pages
609 - 628
Database
ISI
SICI code
0893-6080(1994)7:4<609:ORBFNA>2.0.ZU;2-S
Abstract
Useful connections between radial basis function (RBF) nets and kernel regression estimators (KRE) are established. By using existing theoretical results obtained for KRE as tools, we obtain a number of interesting theoretical results for RBF nets. Upper bounds are presented for convergence rates of the approximation error with respect to the number of hidden units. The existence of a consistent estimator for RBF nets is proven constructively. Upper bounds are also provided for the pointwise and L2 convergence rates of the best consistent estimator for RBF nets as the numbers of both the samples and the hidden units tend to infinity. Moreover, the problem of selecting the appropriate size of the receptive field of the radial basis function is theoretically investigated, and the way this selection is influenced by various factors is elaborated. In addition, some results are also given for the convergence of the empirical error obtained by the least squares estimator for RBF nets.
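The connection the abstract refers to can be illustrated with a minimal sketch: a normalized RBF net whose hidden units are centred on the training samples, with the response values as output weights, coincides with the Nadaraya-Watson kernel regression estimator, and the kernel bandwidth h plays the role of the receptive-field size. This is an illustrative example under those assumptions, not the paper's construction; all function and variable names below are invented for the sketch.

```python
import numpy as np

def gaussian_rbf(u):
    # Gaussian radial basis function, also usable as a regression kernel.
    return np.exp(-0.5 * u ** 2)

def nadaraya_watson(x_query, x_train, y_train, h):
    """Nadaraya-Watson kernel regression estimate at the query points.

    Equivalently: a normalized RBF net with one hidden unit per training
    sample, receptive-field size h, and output weights y_train.
    """
    # Scaled distances between each query point and each training sample.
    d = np.abs(x_query[:, None] - x_train[None, :]) / h
    w = gaussian_rbf(d)                   # hidden-unit activations
    return (w @ y_train) / w.sum(axis=1)  # normalized weighted average

# Noisy samples from a smooth target function.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=200)

xq = np.linspace(0.1, 0.9, 5)
yhat = nadaraya_watson(xq, x, y, h=0.05)
max_err = float(np.max(np.abs(yhat - np.sin(2 * np.pi * xq))))
```

Shrinking h as the sample size grows (the bandwidth/receptive-field trade-off the paper analyzes) controls the balance between bias and variance of this estimator.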