VAPNIK-CHERVONENKIS DIMENSION BOUNDS FOR 2-LAYER AND 3-LAYER NETWORKS

Authors
P.L. Bartlett
Citation
P.L. Bartlett, VAPNIK-CHERVONENKIS DIMENSION BOUNDS FOR 2-LAYER AND 3-LAYER NETWORKS, Neural Computation, 5(3), 1993, pp. 371-373
Citations number
4
Subject Categories
Computer Sciences, Computer Applications & Cybernetics, Neurosciences
Journal title
Neural Computation
ISSN journal
0899-7667
Volume
5
Issue
3
Year of publication
1993
Pages
371 - 373
Database
ISI
SICI code
0899-7667(1993)5:3<371:VDBF2A>2.0.ZU;2-Y
Abstract
We show that the Vapnik-Chervonenkis dimension of the class of functions that can be computed by arbitrary two-layer or some completely connected three-layer threshold networks with real inputs is at least linear in the number of weights in the network. In Valiant's "probably approximately correct" learning framework, this implies that the number of random training examples necessary for learning in these networks is at least linear in the number of weights.
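
For context, the link between the VC dimension lower bound and sample complexity follows the standard PAC learning relation (stated here in general form, not quoted from the paper): if a function class has VC dimension d, then any PAC learning algorithm achieving error at most \varepsilon with confidence 1-\delta requires a number of examples on the order of

m(\varepsilon, \delta) \;=\; \Omega\!\left(\frac{d}{\varepsilon} + \frac{1}{\varepsilon}\log\frac{1}{\delta}\right).

Thus, with d bounded below by a constant multiple of the number of weights W in these networks, the required number of random training examples grows at least linearly with W.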