Y. Li et al., An enhanced training algorithm for multilayer neural networks based on reference output of hidden layer, NEURAL C AP, 8(3), 1999, pp. 218-225
In this paper, the authors propose a new training algorithm that relies not only upon the training samples but also upon the output of the hidden layer. Both the connecting weights and the outputs of the hidden layer are adjusted based on the Least Square Backpropagation (LSB) algorithm. A set of 'required' outputs of the hidden layer is added to the input sets through a feedback path to accelerate convergence. Numerical simulation results demonstrate that the algorithm outperforms the conventional BP, quasi-Newton BFGS (an alternative to the conjugate gradient methods for fast optimisation) and LSB algorithms in terms of convergence speed and training error. The proposed method does not suffer from the drawback of the LSB algorithm, whose training error cannot be further reduced after three iterations.
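The alternating least-squares idea described in the abstract can be sketched as follows. This is a minimal illustrative reconstruction, not the authors' exact algorithm: the network size (2-4-1), the XOR toy data, and the pseudoinverse correction used to form the 'required' hidden outputs are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR with a 2-4-1 sigmoid network (sizes and data are assumed).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logit(p):
    # Inverse sigmoid, clipped so targets stay in the activation's range.
    p = np.clip(p, 1e-3, 1 - 1e-3)
    return np.log(p / (1.0 - p))

Xb = np.hstack([X, np.ones((4, 1))])     # inputs with a bias column
W1 = rng.normal(scale=0.5, size=(3, 4))  # input  -> hidden weights
W2 = rng.normal(scale=0.5, size=(5, 1))  # hidden -> output weights

T2 = logit(Y)                            # desired output pre-activations
for _ in range(20):
    H = sigmoid(Xb @ W1)
    Hb = np.hstack([H, np.ones((4, 1))])
    # 1. Least-squares solve for the output-layer weights.
    W2, *_ = np.linalg.lstsq(Hb, T2, rcond=None)
    # 2. Feedback path: form 'required' hidden outputs by correcting Hb
    #    so that Hb_req @ W2 better matches T2 (pseudoinverse correction;
    #    an assumed concrete choice for the paper's feedback step).
    Hb_req = Hb + (T2 - Hb @ W2) @ np.linalg.pinv(W2)
    H_req = np.clip(Hb_req[:, :4], 1e-3, 1 - 1e-3)  # drop the bias column
    # 3. Least-squares solve for the input-layer weights so the hidden
    #    layer reproduces its 'required' outputs.
    W1, *_ = np.linalg.lstsq(Xb, logit(H_req), rcond=None)

# Forward pass with the trained weights.
pred = sigmoid(np.hstack([sigmoid(Xb @ W1), np.ones((4, 1))]) @ W2)
```

Each iteration replaces a gradient step with two linear least-squares solves, which is what gives LSB-style methods their fast early convergence; the feedback of hidden-layer targets is what the abstract credits for avoiding the stall after a few iterations.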