In this paper we study a new view of the PAC-learning model in which the examples are more complicated than in the standard model. There, an example is usually an element of the learning domain, and its label indicates whether it belongs to the target concept. Here, the examples can be subsets of the domain, and their labels indicate some relation to the target concept, e.g., whether they intersect it or not. We show how this setting can easily be transformed into the standard PAC-model; for the analysis, however, it is much more natural to stick to the original formulation. The central notion is then the relative dimension of a target class with respect to a sample class, which replaces the Vapnik-Chervonenkis dimension. The investigation of structural aspects of the relative dimension is followed by its applications to learning environments. It turns out that computing or bounding the relative dimension leads to interesting combinatorial problems even for simple target and sample classes. Sometimes the analysis is easier if one represents the concepts as unions or intersections of simpler ones. We present sharp bounds on the relative and the Vapnik-Chervonenkis dimension of the complicated class in terms of the simpler one.
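The setting can be made concrete with a small brute-force computation. The sketch below (an illustration, not the paper's construction) takes the intersection relation as the example relation: a sample set is labeled 1 by a target concept iff the two sets intersect. The relative dimension is then the largest number of sample sets on which the target class realizes all sign patterns; with singleton samples the intersection relation reduces to membership, recovering the ordinary Vapnik-Chervonenkis dimension. All names and the tiny three-point domain are our own choices for illustration.

```python
from itertools import combinations

def shatters(labelers, points):
    """Check whether the labelers realize all 2^k label patterns on `points`."""
    patterns = {tuple(lab(p) for p in points) for lab in labelers}
    return len(patterns) == 2 ** len(points)

def vc_dimension(labelers, domain):
    """Largest k such that some k-element subset of `domain` is shattered."""
    d = 0
    for k in range(1, len(domain) + 1):
        if any(shatters(labelers, pts) for pts in combinations(domain, k)):
            d = k
    return d

def relative_dimension(targets, samples):
    """Relative dimension of `targets` w.r.t. `samples` under the
    intersection relation: sample S gets label 1 from target C iff S meets C."""
    # C=C pins each target at definition time (avoids late binding).
    labelers = [lambda S, C=C: int(bool(S & C)) for C in targets]
    return vc_dimension(labelers, samples)

X = {0, 1, 2}
targets = [frozenset(c) for k in range(len(X) + 1)
           for c in combinations(X, k)]        # all subsets of X
singletons = [frozenset({x}) for x in X]

# Singleton samples: intersection = membership, so the relative dimension
# equals the VC dimension of the full power set, namely |X| = 3.
print(relative_dimension(targets, singletons))  # -> 3
```

Already for this toy domain the choice of sample class matters: taking the singletons as targets and all nonempty subsets as samples drops the relative dimension to 1, since three targets can realize at most three of the four patterns needed to shatter two sample sets.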