A dynamical system model is derived for feedforward neural networks with one layer of hidden nodes. The model is valid in the vicinity of flat minima of the cost function that arise due to the formation of clusters of redundant hidden nodes with nearly identical outputs. The derivation is carried out for networks with an arbitrary number of hidden and output nodes and is, therefore, a generalization of previous work valid for networks with only two hidden nodes and one output node. The Jacobian matrix of the system is obtained, whose eigenvalues characterize the evolution of learning. Flat minima correspond to critical points of the phase-plane trajectories, and the bifurcation of the eigenvalues signifies their abandonment. Following the derivation of the dynamical model, we show that identification of the hidden-node clusters using unsupervised learning techniques enables the application of a constrained algorithm (Dynamically Constrained Back Propagation, DCBP) whose purpose is to facilitate prompt bifurcation of the eigenvalues of the Jacobian matrix and, thus, accelerate learning. DCBP is applied to standard benchmark tasks, either autonomously or as an aid to other standard learning algorithms in the vicinity of flat minima. Its application leads to a significant reduction in the number of epochs required for convergence. (C) 2001 Published by Elsevier Science Ltd.
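The flat-minimum mechanism described above can be illustrated numerically. The sketch below is not from the paper; it is a toy example (all names, the fixed weight `w = 0.7`, and the data are assumptions) showing that when two hidden nodes form a redundant cluster with identical outputs, the cost depends only on the sum of their output weights, so the Hessian of the cost acquires a (near-)zero eigenvalue, i.e. a flat direction.

```python
import numpy as np

# Toy 1-2-1 network whose two hidden nodes share the same input weight w,
# so both hidden outputs are identical (a "cluster" of redundant nodes).
# The prediction is (v1 + v2) * tanh(w*x): only v1 + v2 matters.
def cost(v, x, y, w=0.7):
    h = np.tanh(w * x)                    # shared hidden-node output
    pred = (v[0] + v[1]) * h
    return np.sum((pred - y) ** 2)

def hessian(f, p, eps=1e-3):
    """Central-difference Hessian of f at p (exact for quadratics)."""
    n = len(p)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            pp = p.copy(); pp[i] += eps; pp[j] += eps
            pm = p.copy(); pm[i] += eps; pm[j] -= eps
            mp = p.copy(); mp[i] -= eps; mp[j] += eps
            mm = p.copy(); mm[i] -= eps; mm[j] -= eps
            H[i, j] = (f(pp) - f(pm) - f(mp) + f(mm)) / (4 * eps**2)
    return H

x = np.linspace(-2.0, 2.0, 20)
y = np.sin(x)
v = np.array([0.3, 0.3])                  # output weights of the two clones
H = hessian(lambda q: cost(q, x, y), v)
eigs = np.sort(np.linalg.eigvalsh(H))
print(eigs)  # smallest eigenvalue is ~0: the flat direction v1 - v2
```

The near-zero eigenvalue corresponds to the direction that trades weight between the two cloned nodes while leaving the network output unchanged; the bifurcation of such eigenvalues as the cluster breaks up is what DCBP is designed to promote.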