THE VC DIMENSION AND PSEUDODIMENSION OF 2-LAYER NEURAL NETWORKS WITH DISCRETE INPUTS

Citation
P.L. Bartlett and R.C. Williamson, THE VC DIMENSION AND PSEUDODIMENSION OF 2-LAYER NEURAL NETWORKS WITH DISCRETE INPUTS, Neural Computation, 8(3), 1996, pp. 625-628
Number of citations
10
Subject categories
Computer Sciences","Computer Science Artificial Intelligence",Neurosciences
Journal title
Neural Computation
ISSN journal
0899-7667
Volume
8
Issue
3
Year of publication
1996
Pages
625 - 628
Database
ISI
SICI code
0899-7667(1996)8:3<625:TVDAPO>2.0.ZU;2-6
Abstract
We give upper bounds on the Vapnik-Chervonenkis dimension and pseudodimension of two-layer neural networks that use the standard sigmoid function or radial basis functions and have inputs from {-D, ..., D}^n. In Valiant's probably approximately correct (pac) learning framework for pattern classification, and in Haussler's generalization of this framework to nonlinear regression, the results imply that the number of training examples necessary for satisfactory learning performance grows no more rapidly than W log(WD), where W is the number of weights. The previous best bound for these networks was O(W^4).
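As a rough numerical sketch of the improvement the abstract claims, the snippet below simply evaluates the two growth rates, W log(WD) and W^4, for a few network sizes; the choice of D and the values of W are illustrative assumptions, and constant factors and the pac accuracy/confidence parameters that the formal bounds depend on are ignored.

    import math

    # Compare the growth of the new sample-complexity bound, W log(WD),
    # against the previous best bound, W^4, for a few network sizes.
    # D is an assumed input range; constants and pac parameters are ignored.
    D = 256  # inputs drawn from {-D, ..., D}^n

    for W in (10, 100, 1000, 10000):  # W = number of weights
        new_bound = W * math.log(W * D)
        old_bound = W ** 4
        print(f"W={W:>6}: W log(WD) ~ {new_bound:.3g}, W^4 ~ {old_bound:.3g}")

Even at W = 100 the gap is several orders of magnitude, which is why replacing an O(W^4) bound with one growing as W log(WD) matters for practical sample-size estimates.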