P. L. Bartlett and R. C. Williamson, The VC Dimension and Pseudodimension of 2-Layer Neural Networks with Discrete Inputs, Neural Computation, 8(3), 1996, pp. 625-628
We give upper bounds on the Vapnik-Chervonenkis dimension and pseudodimension of two-layer neural networks that use the standard sigmoid function or radial basis function and have inputs from {-D, ..., D}^n. In Valiant's probably approximately correct (PAC) learning framework for pattern classification, and in Haussler's generalization of this framework to nonlinear regression, the results imply that the number of training examples necessary for satisfactory learning performance grows no more rapidly than W log(WD), where W is the number of weights. The previous best bound for these networks was O(W^4).
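To see how a VC dimension bound translates into the stated sample-size requirement, one can plug it into the standard PAC sample-complexity bound for a consistent learner (Blumer, Ehrenfeucht, Haussler, and Warmuth, 1989). The sketch below is an illustration assuming that standard bound, not a derivation taken from the paper itself; the symbols m (number of examples), epsilon (accuracy), and delta (confidence) are the usual PAC parameters and do not appear in the abstract.

```latex
% Standard PAC sample-complexity bound for a consistent learner over a
% hypothesis class of VC dimension d (Blumer et al., 1989); epsilon is
% the accuracy parameter and delta the confidence parameter.
\[
  m \;=\; O\!\left(\frac{1}{\epsilon}\left(d \log\frac{1}{\epsilon}
          + \log\frac{1}{\delta}\right)\right)
\]
% Substituting the paper's estimate d = O(W \log(WD)) shows the sample
% size growing essentially linearly in the number of weights W, up to
% logarithmic factors in W, D, and 1/epsilon:
\[
  m \;=\; O\!\left(\frac{1}{\epsilon}\left(W \log(WD)\,\log\frac{1}{\epsilon}
          + \log\frac{1}{\delta}\right)\right)
\]
```

For fixed epsilon and delta, this reads as the W log(WD) growth quoted in the abstract, compared with the W^4 growth implied by the previous best bound.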