We solve exactly the dynamics of on-line Hebbian learning in large perceptrons, for the regime where the size of the training set scales linearly with the number of inputs. We consider both noiseless and noisy teachers. Our calculation cannot be extended to non-Hebbian rules, but the solution provides a convenient and welcome benchmark with which to test more general and advanced theories for solving the dynamics of learning with restricted training sets.
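To make the setting concrete, the following is a minimal numerical sketch of the scenario described above: a student perceptron trained by on-line Hebbian updates on a restricted training set whose size scales linearly with the input dimension, with labels supplied by a noiseless teacher perceptron. All specific values (`N`, `alpha`, `eta`, the number of update steps) are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200                  # input dimension (illustrative)
alpha = 2.0              # training-set size scales linearly: p = alpha * N
p = int(alpha * N)
eta = 0.1                # learning rate (illustrative)
steps = 20 * p           # number of on-line updates

# Noiseless teacher: labels are sign(B . x) for a fixed teacher vector B
B = rng.standard_normal(N)

# Restricted training set: p fixed examples, recycled throughout training
X = rng.standard_normal((p, N))
y = np.sign(X @ B)

# On-line Hebbian learning: at each step draw one stored example at random
# and apply the Hebbian update, which does not depend on the student J itself
J = np.zeros(N)
for _ in range(steps):
    i = rng.integers(p)
    J += (eta / N) * y[i] * X[i]

# Student-teacher overlap R = J.B / (|J||B|); values near 1 mean good
# alignment with the teacher direction
R = (J @ B) / (np.linalg.norm(J) * np.linalg.norm(B))
print(R)
```

A noisy teacher can be mimicked by flipping each stored label in `y` with some small probability before training; because the Hebbian update ignores the student's own output, the restricted training set only enters through which examples are recycled, which is what makes this rule exactly solvable in this regime.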