A. Krzyzak and T. Linder, RADIAL BASIS FUNCTION NETWORKS AND COMPLEXITY REGULARIZATION IN FUNCTION LEARNING, IEEE Transactions on Neural Networks, 9(2), 1998, pp. 247-256
In this paper we apply the method of complexity regularization to derive estimation bounds for nonlinear function estimation using a single-hidden-layer radial basis function network. Our approach differs from previous complexity-regularization neural-network function learning schemes in that we operate with random covering numbers and l1 metric entropy, making it possible to consider much broader families of activation functions, namely functions of bounded variation. Some constraints previously imposed on the network parameters are also eliminated this way. The network is trained by means of complexity regularization involving empirical risk minimization. Bounds on the expected risk in terms of the sample size are obtained for a large class of loss functions. Rates of convergence to the optimal loss are also derived.