L. Xu et al., ON RADIAL BASIS FUNCTION NETS AND KERNEL REGRESSION - STATISTICAL CONSISTENCY, CONVERGENCE RATES, AND RECEPTIVE-FIELD SIZE, Neural Networks, 7(4), 1994, pp. 609-628
Useful connections between radial basis function (RBF) nets and kernel regression estimators (KRE) are established. By using existing theoretical results obtained for KRE as tools, we obtain a number of interesting theoretical results for RBF nets. Upper bounds are presented for the convergence rates of the approximation error with respect to the number of hidden units. The existence of a consistent estimator for RBF nets is proven constructively. Upper bounds are also provided for the pointwise and L2 convergence rates of the best consistent estimator for RBF nets as the numbers of both the samples and the hidden units tend to infinity. Moreover, the problem of selecting the appropriate size of the receptive field of the radial basis function is theoretically investigated, and the way this selection is influenced by various factors is elaborated. In addition, some results are also given for the convergence of the empirical error obtained by the least squares estimator for RBF nets.
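
The connection the abstract draws can be illustrated with a minimal sketch: a Nadaraya-Watson kernel regression estimator is, in form, a normalized RBF net whose centers are the training inputs and whose bandwidth plays the role of the receptive-field size. The function and variable names below are our own illustrative choices, not taken from the paper, and the example uses a Gaussian kernel purely for concreteness.

```python
import math

def nadaraya_watson(x, xs, ys, h):
    """Nadaraya-Watson kernel regression estimate at point x.

    Structurally a normalized RBF net: Gaussian "basis functions"
    centered at the training inputs xs, with receptive-field size h.
    (Illustrative sketch; not the paper's construction.)
    """
    # Gaussian RBF weight of each center xi for the query point x.
    weights = [math.exp(-((x - xi) ** 2) / (2.0 * h * h)) for xi in xs]
    total = sum(weights)
    # Weighted average of the targets = normalized RBF-net output.
    return sum(w * yi for w, yi in zip(weights, ys)) / total

# Noise-free samples of f(x) = x^2 on [0, 1].
xs = [i / 10 for i in range(11)]
ys = [xi ** 2 for xi in xs]

# With a small receptive field, the estimate at a sample point is
# close to the sample value; a larger h smooths over more neighbors.
est = nadaraya_watson(0.5, xs, ys, h=0.05)
```

Varying `h` here mirrors the receptive-field-size selection question the paper studies: too small a field interpolates noise, too large a field oversmooths.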