M. Opper, Learning and Generalization in a Two-Layer Neural Network: The Role of the Vapnik-Chervonenkis Dimension, Physical Review Letters 72(13), 1994, pp. 2113-2116
Bounds for the generalization ability of neural networks based on Vapnik-Chervonenkis (VC) theory are compared with statistical mechanics results for the case of the parity machine. For fixed phase-space dimension, the VC dimension grows arbitrarily as the number K of hidden units is increased. Generalization is impossible up to a critical number of training examples, which grows with the VC dimension. The asymptotic decrease of the generalization error ε_G turns out to be independent of K, and the VC bounds strongly overestimate ε_G. This shows that the phase-space dimension and the VC dimension can play independent and different roles in the generalization process.
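
A minimal Python sketch may help make the setup concrete: a parity machine outputs the product of the signs of K hidden perceptrons, ε_G is the probability that a student machine disagrees with a teacher on a random input, and a textbook Vapnik-style uniform-convergence bound is shown for comparison. The Gaussian teacher/student pair, the sample sizes, and the placeholder VC dimension d_vc = N*K are illustrative assumptions, not values taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

def parity_output(W, x):
    # Parity machine: product of the signs of the K hidden perceptron outputs.
    return np.prod(np.sign(W @ x))

# Illustrative setup (assumed, not from the paper): N inputs, K hidden units,
# teacher and student with independent Gaussian weights, i.e. an untrained
# student, for which ε_G should come out near 1/2 (no generalization).
N, K, m = 20, 3, 1000
teacher = rng.standard_normal((K, N))
student = rng.standard_normal((K, N))

# Monte Carlo estimate of the generalization error ε_G: the probability that
# student and teacher disagree on a random Gaussian input.
X = rng.standard_normal((m, N))
eps_G = np.mean([parity_output(student, x) != parity_output(teacher, x) for x in X])

# A standard Vapnik-style bound on the deviation between training and
# generalization error; d_vc = N*K is only a placeholder, since the paper's
# point is that such bounds can strongly overestimate the true ε_G.
def vc_bound(d_vc, m, delta=0.05):
    return np.sqrt((d_vc * (np.log(2 * m / d_vc) + 1) + np.log(4 / delta)) / m)

print(f"estimated eps_G = {eps_G:.3f}, VC bound = {vc_bound(N * K, m):.3f}")

With the random (untrained) student, the estimated ε_G sits near 0.5, which mirrors the abstract's statement that generalization is impossible below a critical number of training examples; the printed VC bound illustrates only the form of the bound being compared, not the paper's quantitative results.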