LEARNING AND GENERALIZATION IN A 2-LAYER NEURAL-NETWORK - THE ROLE OF THE VAPNIK-CHERVONENKIS DIMENSION

Authors
M. Opper
Citation
M. Opper, LEARNING AND GENERALIZATION IN A 2-LAYER NEURAL-NETWORK - THE ROLE OF THE VAPNIK-CHERVONENKIS DIMENSION, Physical Review Letters, 72(13), 1994, pp. 2113-2116
Citations number
19
Subject Categories
Physics
Journal title
Physical Review Letters
ISSN journal
0031-9007
Volume
72
Issue
13
Year of publication
1994
Pages
2113 - 2116
Database
ISI
SICI code
0031-9007(1994)72:13<2113:LAGIA2>2.0.ZU;2-D
Abstract
Bounds for the generalization ability of neural networks based on Vapnik-Chervonenkis (VC) theory are compared with statistical mechanics results for the case of the parity machine. For fixed phase space dimension, the VC dimension can be made arbitrarily large by increasing the number K of hidden units. Generalization is impossible up to a critical number of training examples that grows with the VC dimension. The asymptotic decrease of the generalization error epsilon(G) turns out to be independent of K, and the VC bounds strongly overestimate epsilon(G). This shows that phase space dimension and VC dimension can play independent and different roles in the generalization process.
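To make the architecture in the abstract concrete: a parity machine with K hidden units outputs the product of the signs of K perceptrons, each acting on its own disjoint block of the input (tree architecture). The sketch below is a minimal illustration under that assumption; the names parity_machine, W, and teacher are hypothetical and not from the source.

import numpy as np

def parity_machine(x, W):
    """Output of a tree parity machine.

    x : input vector of length N = K * n (disjoint receptive fields)
    W : (K, n) array, one weight vector per hidden unit
    """
    K, n = W.shape
    blocks = x.reshape(K, n)              # split input into K disjoint fields
    h = np.einsum("ki,ki->k", W, blocks)  # local field of each hidden unit
    return int(np.prod(np.sign(h)))       # parity (+/-1) of the hidden signs

# Toy usage: a random teacher machine labels a random example.
rng = np.random.default_rng(0)
K, n = 3, 10                              # 3 hidden units, 30 inputs total
teacher = rng.standard_normal((K, n))
x = rng.standard_normal(K * n)
print(parity_machine(x, teacher))         # prints 1 or -1

Note that for fixed total input dimension N, increasing K leaves the number of weights (the phase space dimension) unchanged, while the VC dimension of the machine grows; this is the separation between the two quantities that the abstract refers to.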