Y.F. Wang et al., MULTIPLE TRAINING CONCEPT FOR BACKPROPAGATION NEURAL NETWORKS FOR USE IN ASSOCIATIVE MEMORIES, Neural Networks, 6(8), 1993, pp. 1169-1175
The multiple training concept, first applied to Bidirectional Associative Memory (BAM) training, is applied here to the back-propagation (BP) algorithm for use in associative memories. The new algorithm, which assigns different weights to the various training pairs in the energy function, is called multiple training back-propagation (MTBP). The pair weights are updated during the training phase using the basic differential multiplier method (BDMM). A sufficient condition for convergence of the training phase is that the second derivative of the energy function with respect to the synapse weights is positive along the paths of both the synapse weights and the pair weights. A simple example of the use of the algorithm is provided, followed by two simulations showing that the algorithm can increase the training speed of the network dramatically.
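The core idea described above can be sketched in code: each training pair p receives its own weight c_p in the energy function E = Σ_p c_p · ||t_p − y_p||², the synapse weights descend the gradient of this weighted energy, and the pair weights rise for pairs that remain poorly learned (a BDMM-style multiplier update). The network shape, learning rates, task, and update details below are illustrative assumptions, not the paper's exact formulation.

```python
import math
import random

random.seed(0)

# Tiny 2-2-1 sigmoid network trained on XOR as stand-in associative pairs.
# All hyperparameters here are illustrative, not taken from the paper.
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b1 = [0.0, 0.0]
W2 = [random.uniform(-1, 1) for _ in range(2)]
b2 = 0.0

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x):
    """One forward pass; returns hidden activations and output."""
    h = [sigmoid(sum(W1[j][i] * x[i] for i in range(2)) + b1[j])
         for j in range(2)]
    y = sigmoid(sum(W2[j] * h[j] for j in range(2)) + b2)
    return h, y

pairs = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
c = [1.0] * len(pairs)      # per-pair weights (multiplier-like, start equal)
eta, eta_c = 2.0, 0.5       # synapse-weight and pair-weight learning rates

for epoch in range(5000):
    for p, (x, t) in enumerate(pairs):
        h, y = forward(x)
        err = t - y
        # Gradient descent on the weighted pair energy c_p * err^2 / 2:
        # the pair weight c_p simply scales the usual BP delta.
        delta_y = c[p] * err * y * (1 - y)
        for j in range(2):
            delta_h = delta_y * W2[j] * h[j] * (1 - h[j])
            W2[j] += eta * delta_y * h[j]
            for i in range(2):
                W1[j][i] += eta * delta_h * x[i]
            b1[j] += eta * delta_h
        b2 += eta * delta_y
        # BDMM-style ascent: pairs with larger residual error gain weight,
        # so hard pairs are emphasised in later epochs.
        c[p] += eta_c * err * err

print("pair weights:", [round(cp, 3) for cp in c])
print("outputs:", [round(forward(x)[1], 3) for x, _ in pairs])
```

The gradient-descent/gradient-ascent split is the essence of BDMM: synapse weights minimize the energy while pair weights maximize it, driving extra training effort toward the pairs that the network has not yet stored.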