The original and most widely studied PAC model for learning assumes a passive learner, in the sense that the learner plays no role in obtaining information about the unknown concept: the samples are simply drawn independently from some probability distribution. Some work has been done on studying more powerful oracles and how they affect learnability. To find bounds on the improvement in sample complexity that can be expected from using oracles, we consider active learning in the sense that the learner has complete control over the information received. Specifically, we allow the learner to ask arbitrary yes/no questions. We consider both active learning under a fixed distribution and distribution-free active learning. In the case of active learning, the underlying probability distribution is used only to measure the distance between concepts. For learnability with respect to a fixed distribution, active learning does not enlarge the set of learnable concept classes, but it can improve the sample complexity. For distribution-free learning, it is shown that a concept class is actively learnable iff it is finite, so that active learning is in fact less powerful than the usual passive learning model. We also consider a form of distribution-free learning in which the learner knows the distribution being used, so that "distribution-free" refers only to the requirement that a bound on the number of queries can be obtained uniformly over all distributions. Even with this side information about the distribution being used, a concept class is actively learnable iff it has finite VC dimension, so that active learning with side information still does not enlarge the set of learnable concept classes.
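To make the fixed-distribution claim concrete, the following is a minimal sketch (not the paper's construction) of active learning with arbitrary yes/no questions for the hypothetical class of threshold concepts c_t(x) = 1[x >= t] on [0,1] under the uniform distribution, where distance between concepts is measured by the probability mass on which they disagree. Bisection questions of the form "is t <= m?" identify the threshold to within epsilon using about log2(1/epsilon) queries, illustrating the kind of sample-complexity improvement over passive sampling that the abstract refers to:

```python
import math

def active_learn_threshold(target_t, epsilon):
    """Actively learn a threshold concept c_t(x) = 1[x >= t] on [0,1]
    under the uniform distribution, via arbitrary yes/no queries.

    Hypothetical illustration: each query asks "is t <= mid?" and the
    oracle (which knows target_t) answers truthfully. The interval
    containing t halves with every answer.
    """
    lo, hi = 0.0, 1.0
    queries = 0
    while hi - lo > epsilon:
        mid = (lo + hi) / 2.0
        queries += 1
        if target_t <= mid:   # oracle's yes/no answer
            hi = mid
        else:
            lo = mid
    # Midpoint of the final interval is within epsilon/2 of t,
    # i.e. the hypothesis is epsilon-close to c_t in distribution distance.
    return (lo + hi) / 2.0, queries

estimate, num_queries = active_learn_threshold(0.3, 1e-3)
```

Here `num_queries` equals ceil(log2(1/epsilon)) = 10, whereas a passive learner drawing uniform random samples needs on the order of 1/epsilon examples to pin the threshold down to the same accuracy.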