We consider the problem of learning real-valued functions from random examples when the function values are corrupted with noise. Under mild conditions on independent observation noise, we characterize the learnability of a real-valued function class in terms of a generalization of the Vapnik-Chervonenkis dimension: the fat-shattering function, introduced by Kearns and Schapire. We show that, given some restrictions on the noise, a function class is learnable in our model if and only if its fat-shattering function is finite. Under different (also quite mild) restrictions, satisfied for example by Gaussian noise, we show that a function class is learnable from polynomially many examples if and only if its fat-shattering function grows polynomially. We prove analogous results in an agnostic setting, where there is no assumption of an underlying function class. © 1996 Academic Press, Inc.