The majority of results in computational learning theory concern concept learning, i.e., the special case of function learning for classes of functions with range {0, 1}. Much less is known about the theory of learning functions with a larger range, such as N or R. In particular, relatively few results exist about the general structure of common models for function learning, and there are only very few nontrivial function classes for which positive learning results have been exhibited in any of these models. We introduce in this paper the notion of a binary branching adversary tree for function learning, which allows us to give a somewhat surprising equivalent characterization of the optimal learning cost for learning a class of real-valued functions (in terms of a max-min definition which does not involve any "learning" model). Another general structural result of this paper relates the cost of learning a union of function classes to the learning costs for the individual function classes. Furthermore, we exhibit an efficient learning algorithm for learning convex piecewise linear functions from R^d into R. Previously, the class of linear functions from R^d into R was the only class of functions with a multidimensional domain that was known to be learnable within the rigorous framework of a formal model for online learning. Finally, we give a sufficient condition on an arbitrary class F of functions from R into R that allows us to learn the class of all functions that can be written as the pointwise maximum of k functions from F. This allows us to exhibit a number of further nontrivial classes of functions from R into R for which efficient learning algorithms exist.
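As an illustration of the pointwise-maximum construction mentioned above (a sketch for intuition, not taken from the paper): a convex piecewise linear function on R^d can be written as the maximum of finitely many affine functions, so evaluating such a function reduces to taking a max over k affine pieces. The coefficients below are hypothetical.

```python
# Sketch: a convex piecewise linear function f: R^d -> R represented as the
# pointwise maximum of k affine pieces f_i(x) = w_i . x + b_i.
# All coefficient values are illustrative, not from the paper.

def piecewise_linear_max(pieces, x):
    """Evaluate max_i (w_i . x + b_i) at the point x."""
    return max(sum(wj * xj for wj, xj in zip(w, x)) + b for w, b in pieces)

# Three affine pieces in R^2 (hypothetical values):
pieces = [([1.0, 0.0], 0.0),    # f1(x) = x1
          ([0.0, 1.0], 0.0),    # f2(x) = x2
          ([-1.0, -1.0], 1.0)]  # f3(x) = 1 - x1 - x2

print(piecewise_linear_max(pieces, [0.2, 0.3]))  # max(0.2, 0.3, 0.5) = 0.5
```

The max of affine functions is always convex, which is why the class of convex piecewise linear functions coincides with such pointwise maxima.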