N. Kofidis et al., Chaotic properties and pattern competition during the learning phase of back propagation neural networks, INT J COM M, 74(4), 2000, pp. 407-437
This paper examines the chaotic behavior of Back Propagation neural networks during the training phase. The networks are trained using ordinary parameter values, and two different cases are considered. In the first one, the network does not achieve the desired convergence within a pre-specified number of epochs. The chaotic behavior of this network is demonstrated by examining the dominant Lyapunov exponents of the weight data series produced by additional training. For each training epoch, the data series representing the input patterns that produce the minimum absolute error in the output during additional training is also subjected to Lyapunov exponent investigation. The aim of this investigation is to determine whether the network exhibits chaotic pattern competition among the best learned inputs. In the second case, the network is improved and the desired convergence is accomplished. Again, the investigation focuses on the series of values representing the input patterns that produce outputs with minimum absolute error. The results obtained from the dominant Lyapunov exponent estimates show that chaotic pattern competition is still present, even though the network practically satisfies the stability demands within the predetermined accuracy limits. The best estimation series consist of the output values corresponding to the best learned input patterns. These series are examined using the theoretical tool of topological conjugacy, in addition to numerical verification of the results.
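To make the kind of experiment described above concrete, the following is a minimal sketch (not the authors' code): a small back propagation network is trained while one weight value and the index of the best learned input pattern are recorded per epoch, and the dominant Lyapunov exponent of the weight series is then estimated with a simple Rosenstein-style divergence method. The network size, XOR task, learning rate, epoch count, and embedding parameters are illustrative assumptions, not values taken from the paper.

```python
# Sketch only: toy back-propagation run plus a Rosenstein-style estimate of the
# dominant Lyapunov exponent of a per-epoch weight series. All hyperparameters
# are assumptions for illustration, not the settings used in the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy network: 2 inputs, 3 hidden units, 1 output, trained on XOR (assumed task).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])
W1 = rng.normal(scale=0.5, size=(2, 3))
W2 = rng.normal(scale=0.5, size=(3, 1))
lr = 0.5                                   # an "ordinary" learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

weight_series = []                         # one tracked weight per epoch
best_pattern = []                          # pattern index with minimum |error|

for epoch in range(4000):
    H = sigmoid(X @ W1)                    # hidden activations
    Y = sigmoid(H @ W2)                    # network outputs
    err = Y - T
    # back-propagate the error and take a batch gradient step
    dY = err * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dY
    W1 -= lr * X.T @ dH
    weight_series.append(W2[0, 0])         # data series of a single weight
    best_pattern.append(int(np.argmin(np.abs(err).sum(axis=1))))

def dominant_lyapunov(series, dim=3, delay=1, horizon=20, theiler=10):
    """Rosenstein-style estimate of the dominant Lyapunov exponent of a scalar series."""
    s = np.asarray(series, dtype=float)
    n = len(s) - (dim - 1) * delay
    # delay-coordinate embedding of the scalar series
    emb = np.column_stack([s[i * delay: i * delay + n] for i in range(dim)])
    usable = n - horizon
    nn = np.empty(usable, dtype=int)
    for i in range(usable):
        d = np.linalg.norm(emb[:usable] - emb[i], axis=1)
        lo, hi = max(0, i - theiler), min(usable, i + theiler + 1)
        d[lo:hi] = np.inf                  # exclude temporally close neighbours
        nn[i] = np.argmin(d)
    # average log divergence of initially nearby states over the horizon
    div = []
    for k in range(1, horizon + 1):
        sep = np.linalg.norm(emb[np.arange(usable) + k] - emb[nn + k], axis=1)
        sep = sep[sep > 0]
        div.append(np.log(sep).mean())
    # slope of the divergence curve estimates the dominant exponent
    slope, _ = np.polyfit(np.arange(1, horizon + 1), div, 1)
    return slope

print("dominant Lyapunov exponent (weight series):",
      dominant_lyapunov(weight_series))
print("best-learned pattern indices (last 10 epochs):", best_pattern[-10:])
```

A positive slope from the divergence curve suggests sensitive dependence on initial conditions in the recorded series, while the per-epoch `best_pattern` record is the sort of series on which chaotic pattern competition would be investigated; how the paper itself computes and interprets these quantities is described in the full text.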