MULTIPLE TRAINING CONCEPT FOR BACKPROPAGATION NEURAL NETWORKS FOR USE IN ASSOCIATIVE MEMORIES

Citation
Y.F. Wang et al., MULTIPLE TRAINING CONCEPT FOR BACKPROPAGATION NEURAL NETWORKS FOR USE IN ASSOCIATIVE MEMORIES, Neural Networks, 6(8), 1993, pp. 1169-1175
Citations number
21
Subject Categories
Mathematical Methods, Biology & Medicine; Computer Sciences, Special Topics; Computer Applications & Cybernetics; Neurosciences; Physics, Applied
Journal title
Neural Networks
ISSN journal
08936080
Volume
6
Issue
8
Year of publication
1993
Pages
1169 - 1175
Database
ISI
SICI code
0893-6080(1993)6:8<1169:MTCFBN>2.0.ZU;2-J
Abstract
The multiple training concept, first applied to Bidirectional Associative Memory training, is applied to the back-propagation (BP) algorithm for use in associative memories. This new algorithm, which assigns different weights to the various pairs in the energy function, is called multiple training back-propagation (MTBP). The pair weights are updated during the training phase using the basic differential multiplier method (BDMM). A sufficient condition for convergence of the training phase is that the second derivative of the energy function with respect to the weights of the synapses is positive along the paths of both synapse weights and pair weights. A simple example of the use of the algorithm is provided, followed by two simulations that show that this algorithm can increase the training speed of the network dramatically.
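The scheme the abstract describes can be illustrated with a minimal sketch: a pair-weighted squared-error energy E = sum_k c_k * ||y_k - f(x_k)||^2, where the synapse weights descend the energy gradient by ordinary back-propagation while each pair weight c_k is increased in proportion to that pair's remaining error, in the spirit of the basic differential multiplier method (BDMM). The network size, learning rates, and exact update rules below are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

# Illustrative sketch of MTBP-style training (assumed details, not the
# paper's exact algorithm): back-propagation on a pair-weighted energy,
# with BDMM-style gradient ascent on the per-pair weights c_k so that
# poorly stored pairs receive growing emphasis.

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Four binary association pairs (an XOR-like task) stored in a 2-3-1 network.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(0.0, 1.0, (2, 3)); b1 = np.zeros(3)
W2 = rng.normal(0.0, 1.0, (3, 1)); b2 = np.zeros(1)
c = np.ones(len(X))              # one pair weight (multiplier) per pair

def predict(X):
    return sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)

mse_before = float(np.mean((predict(X) - Y) ** 2))

eta_w, eta_c = 0.5, 0.01         # descent rate for synapses, ascent rate for c

for epoch in range(2000):
    for k in range(len(X)):
        h = sigmoid(X[k] @ W1 + b1)        # hidden activations
        o = sigmoid(h @ W2 + b2)           # network output for pair k
        err = o - Y[k]
        # Back-propagate the pair-weighted error c_k * err.
        delta_o = c[k] * err * o * (1.0 - o)
        delta_h = (delta_o @ W2.T) * h * (1.0 - h)
        W2 -= eta_w * np.outer(h, delta_o); b2 -= eta_w * delta_o
        W1 -= eta_w * np.outer(X[k], delta_h); b1 -= eta_w * delta_h
        # BDMM-style ascent on the pair weight: hard pairs gain emphasis.
        c[k] += eta_c * float(err @ err)

mse_after = float(np.mean((predict(X) - Y) ** 2))
```

Because the c_k updates are nonnegative, pair weights only grow, which redistributes learning effort toward the pairs the network stores worst; this is the mechanism the abstract credits for the speed-up over plain BP.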