We study the dynamics of supervised on-line learning of realizable tasks in feed-forward neural networks. We focus on the regime where the number of examples used for training is proportional to the number of input channels N. Using generating functional techniques from spin glass theory, we are able to average over the composition of the training set and transform the problem for N --> infinity into an effective single-pattern system, described completely by the student autocovariance, the student-teacher overlap and the student response function, with exact closed equations.
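To fix notation, here is a minimal sketch of the standard definitions of these three order parameters, assuming a student weight vector J(t) in R^N and a fixed teacher vector B (the symbols J, B and theta are illustrative and not fixed by this abstract):

% Illustrative definitions; the angular brackets denote the average over
% the composition of the training set, and theta_i(t') is an external
% perturbing field used to define the response function.
\begin{align*}
  C(t,t') &= \lim_{N\to\infty}\frac{1}{N}\,\big\langle\,\mathbf{J}(t)\cdot\mathbf{J}(t')\,\big\rangle
  && \text{(student autocovariance)}\\
  R(t) &= \lim_{N\to\infty}\frac{1}{N}\,\big\langle\,\mathbf{J}(t)\cdot\mathbf{B}\,\big\rangle
  && \text{(student--teacher overlap)}\\
  G(t,t') &= \lim_{N\to\infty}\frac{1}{N}\sum_{i=1}^{N}
  \frac{\delta\langle J_i(t)\rangle}{\delta\theta_i(t')}
  && \text{(student response function)}
\end{align*}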
Our method applies to arbitrary learning rules, i.e., not necessarily of a gradient-descent type. The resulting exact macroscopic dynamical equations can be integrated without finite-size effects up to any degree of accuracy, but their main value is in providing an exact and simple starting point for analytical approximation schemes.
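Although the closed equations themselves are only derived in the body of the paper, the integration strategy is generic: the two-time objects live on a causal time grid and can be stepped forward argument by argument. Below is a minimal sketch in Python of such a causal two-time grid integration; the toy relaxation dynamics and the parameters dt, T and nu are assumed placeholders, not the paper's actual equations.

# A minimal sketch of causal two-time grid integration, free of
# finite-size effects.  NOTE: the toy dynamics below (exponential
# relaxation towards a teacher alignment nu) are illustrative
# placeholders, NOT the equations derived in the paper.
import numpy as np

dt, T = 0.01, 5.0        # Euler step and time horizon (assumed values)
n = int(T / dt)          # number of grid points
nu = 0.8                 # toy teacher-alignment parameter (assumed)

R = np.zeros(n)          # student-teacher overlap R(t)
C = np.zeros((n, n))     # student autocovariance C(t,t')
G = np.zeros((n, n))     # response function G(t,t'), zero for t < t'

C[0, 0] = 1.0            # initial weight normalization J(0).J(0)/N = 1
for i in range(n):
    G[i, i] = 1.0        # equal-time response of the toy kernel

for i in range(n - 1):
    # toy single-time equation:  dR/dt = nu - R
    R[i + 1] = R[i] + dt * (nu - R[i])
    # toy two-time equations, stepped forward in the first time
    # argument only, so causality is respected
    for j in range(i + 1):
        C[i + 1, j] = C[i, j] + dt * (-C[i, j] + nu * R[j])
        G[i + 1, j] = G[i, j] - dt * G[i, j]   # dG(t,t')/dt = -G(t,t')
    # equal-time point combines both partial time derivatives
    C[i + 1, i + 1] = C[i, i] + dt * (-2.0 * C[i, i] + 2.0 * nu * R[i])

print("R(T)   =", R[-1])        # approaches nu for large T
print("C(T,T) =", C[-1, -1])    # approaches nu**2 for this toy model

Stepping only the first time argument inside the inner loop keeps causality manifest on the lower-triangular grid; refining dt improves the accuracy systematically, with no finite-size noise to average away.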
Finally, we show how, in the region where anomalous response is absent and under the hypothesis that (as in detailed balance systems) the short-time parts of the various operators can be transformed away, one can describe the stationary state of the network successfully by a set of coupled equations involving only four scalar order parameters.
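The abstract does not spell out which four scalars these are; by analogy with detailed balance systems, where two-time operators separate into a short-time part and a persistent part, a plausible (assumed) choice is the persistent limits of the functions defined above:

% Assumed, for illustration only: persistent order parameters left over
% after the short-time parts of C, R and G have been transformed away.
\begin{align*}
  Q &= \lim_{t\to\infty} C(t,t), &
  c &= \lim_{\tau\to\infty}\lim_{t\to\infty} C(t+\tau,t),\\
  r &= \lim_{t\to\infty} R(t), &
  \chi &= \lim_{t\to\infty}\int_0^t\!dt'\,G(t,t').
\end{align*}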