RADIAL BASIS FUNCTION NETWORKS AND COMPLEXITY REGULARIZATION IN FUNCTION LEARNING

Citation
A. Krzyzak and T. Linder, RADIAL BASIS FUNCTION NETWORKS AND COMPLEXITY REGULARIZATION IN FUNCTION LEARNING, IEEE Transactions on Neural Networks, 9(2), 1998, pp. 247-256
Citations number
31
Subject Categories
Computer Science Artificial Intelligence","Computer Science Hardware & Architecture","Computer Science Theory & Methods","Computer Science Artificial Intelligence","Computer Science Hardware & Architecture","Computer Science Theory & Methods
ISSN journal
1045-9227
Volume
9
Issue
2
Year of publication
1998
Pages
247 - 256
Database
ISI
SICI code
1045-9227(1998)9:2<247:RBFNAC>2.0.ZU;2-#
Abstract
In this paper we apply the method of complexity regularization to derive estimation bounds for nonlinear function estimation using a single hidden layer radial basis function network. Our approach differs from previous complexity regularization neural-network function learning schemes in that we operate with random covering numbers and l(1) metric entropy, making it possible to consider much broader families of activation functions, namely functions of bounded variation. Some constraints previously imposed on the network parameters are also eliminated this way. The network is trained by means of complexity regularization involving empirical risk minimization. Bounds on the expected risk in terms of the sample size are obtained for a large class of loss functions. Rates of convergence to the optimal loss are also derived.
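To make the abstract's training scheme concrete, below is a minimal illustrative sketch of complexity-regularized empirical risk minimization for a single hidden layer Gaussian RBF regression network: for each candidate network size k the weights are fitted by least squares, and the size minimizing empirical risk plus a complexity penalty is selected. The Gaussian basis, the center placement, the penalty form c*k*log(n)/n, and the constant c are assumptions made for illustration only; they are not the paper's exact estimator or bound.

```python
# Illustrative sketch (assumed details, not the paper's exact construction):
# single hidden layer Gaussian RBF regression fitted by least squares, with
# the number of hidden nodes chosen by penalized empirical risk minimization.
import numpy as np

def rbf_design(x, centers, width):
    """Gaussian RBF design matrix with a constant (bias) column."""
    d = x[:, None] - centers[None, :]
    phi = np.exp(-(d ** 2) / (2.0 * width ** 2))
    return np.hstack([phi, np.ones((x.shape[0], 1))])

def fit_rbf(x, y, k, width=0.2):
    """Least-squares fit of an RBF network with k hidden nodes."""
    # Spread centers over the empirical distribution of x (an illustrative choice).
    centers = np.quantile(x, np.linspace(0.05, 0.95, k))
    phi = rbf_design(x, centers, width)
    w, *_ = np.linalg.lstsq(phi, y, rcond=None)
    return centers, width, w

def empirical_risk(x, y, model):
    """Empirical risk under squared-error loss."""
    centers, width, w = model
    pred = rbf_design(x, centers, width) @ w
    return np.mean((y - pred) ** 2)

def complexity_regularized_fit(x, y, k_max=20, c=1.0):
    """Select the network size minimizing empirical risk + complexity penalty."""
    n = len(x)
    best = None
    for k in range(1, k_max + 1):
        model = fit_rbf(x, y, k)
        # Assumed penalty form: proportional to network size over sample size.
        penalized = empirical_risk(x, y, model) + c * k * np.log(n) / n
        if best is None or penalized < best[0]:
            best = (penalized, k, model)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 200)
    y = np.sin(3 * np.pi * x) + 0.1 * rng.standard_normal(200)
    penalized, k, model = complexity_regularized_fit(x, y)
    print(f"selected k = {k}, penalized empirical risk = {penalized:.4f}")
```

The penalty term plays the role of the complexity regularizer discussed in the abstract: it discourages large networks whose empirical risk is small only by overfitting, so the selected size balances approximation against estimation error.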