We prove general lower bounds on the number of examples needed for learning function classes within different natural learning models that are related to PAC-learning (and coincide with the PAC-learning model of Valiant in the case of {0, 1}-valued functions). The lower bounds are obtained by showing that all nontrivial function classes contain a "hard binary-valued subproblem." Although at first glance it seems likely that real-valued function classes are much harder to learn than their hardest binary-valued subproblem, we show that these general lower bounds cannot be improved by more than a logarithmic factor. This is done by discussing some natural function classes, such as nondecreasing functions or piecewise-smooth functions (the function classes discussed in [M. J. Kearns and R. E. Schapire, Proc. 31st Annual Symposium on the Foundations of Computer Science, IEEE Computer Society Press, Los Alamitos, CA, 1990, pp. 382-392; full version, J. Comput. System Sci., 48 (1994), pp. 464-497], [D. Kimber and P. M. Long, Proc. 5th Annual Workshop on Computational Learning Theory, ACM, New York, 1992, pp. 153-160]), with certain restrictions concerning their slope.
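As an illustration (a sketch of our own, not the paper's construction) of how binary-valued behavior can sit inside a real-valued class with a slope restriction, consider the class of L-Lipschitz functions f : [0, 1] -> [0, 1]; the Lipschitz bound L and the grid of points below are assumptions made only for this example. Place points x_1 < ... < x_d in [0, 1] with x_{i+1} - x_i >= 2/L. Then every labeling b in {0, 1}^d is realized by the piecewise-linear function

\[
  f_b(x) \;=\; \sum_{i \,:\, b_i = 1} \max\bigl(0,\; 1 - L\,\lvert x - x_i\rvert\bigr),
\]

which takes values in [0, 1], has slope at most L (the triangular bumps have disjoint supports), and satisfies f_b(x_i) = b_i. The class therefore contains a {0, 1}-valued subproblem on d points, which any learner for the real-valued class must in particular handle; this conveys the spirit of a "hard binary-valued subproblem," although the paper's actual embedding may differ.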