P.W. Goldberg and M.R. Jerrum, Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers, Machine Learning, 18(2-3), 1995, pp. 131-148
The Vapnik-Chervonenkis (V-C) dimension is an important combinatorial tool in the analysis of learning problems in the PAC framework. For polynomial learnability, we seek upper bounds on the V-C dimension that are polynomial in the syntactic complexity of concepts. Such upper bounds are automatic for discrete concept classes, but hitherto little has been known about what general conditions guarantee polynomial bounds on V-C dimension for classes in which concepts and examples are represented by tuples of real numbers. In this paper, we show that for two general kinds of concept class the V-C dimension is polynomially bounded in the number of real numbers used to define a problem instance. One is classes where the criterion for membership of an instance in a concept can be expressed as a formula (in the first-order theory of the reals) with fixed quantification depth and exponentially bounded length, whose atomic predicates are polynomial inequalities of exponentially bounded degree. The other is classes where containment of an instance in a concept is testable in polynomial time, assuming we may compute standard arithmetic operations on reals exactly in constant time. Our results show that in the continuous case, as in the discrete, the real barrier to efficient learning in the Occam sense is complexity-theoretic and not information-theoretic. We present examples to show how these results apply to concept classes defined by geometrical figures and neural nets, and derive polynomial bounds on the V-C dimension for these classes.
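As a concrete illustration of the shattering notion underlying the V-C dimension (this example is not from the paper), consider the real-parameterized class of closed intervals [a, b] on the line. A brute-force sketch can check whether a finite point set is shattered, i.e. whether intervals realize all 2^n labelings of the points; it confirms that intervals shatter two points but no three, so their V-C dimension is 2.

```python
def achievable_labelings(points):
    """All labelings of `points` realizable by closed intervals [a, b].

    For distinct sorted points it suffices to try the empty interval and
    every interval whose endpoints are two of the points themselves.
    """
    pts = sorted(points)
    labelings = {tuple(False for _ in pts)}  # empty interval labels nothing
    for i in range(len(pts)):
        for j in range(i, len(pts)):
            labelings.add(tuple(pts[i] <= x <= pts[j] for x in pts))
    return labelings

def is_shattered(points):
    """True iff intervals realize every one of the 2^n labelings."""
    return len(achievable_labelings(points)) == 2 ** len(points)

print(is_shattered([0.0, 1.0]))        # True: all 4 labelings realizable
print(is_shattered([0.0, 1.0, 2.0]))   # False: {x0, x2} without x1 unreachable
```

Intervals can only pick out a contiguous run of sorted points, which is why no three points are shattered; the classes treated in the paper (semialgebraic formulas, polynomial-time real-arithmetic tests) generalize this kind of parameter-counting bound.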