D.R. Liu and Z.J. Lu, A NEW SYNTHESIS APPROACH FOR FEEDBACK NEURAL NETWORKS BASED ON THE PERCEPTRON TRAINING ALGORITHM, IEEE Transactions on Neural Networks, 8(6), 1997, pp. 1468-1482
In this paper, a new synthesis approach is developed for associative memories based on the perceptron training algorithm. The design (synthesis) problem of feedback neural networks for associative memories is formulated as a set of linear inequalities such that the use of perceptron training is evident. The perceptron training in the synthesis algorithms is guaranteed to converge for the design of neural networks without any constraints on the connection matrix. For neural networks with constraints on the diagonal elements of the connection matrix, results concerning the properties of such networks and concerning the existence of such a network design are established. For neural networks with sparsity and/or symmetry constraints on the connection matrix, design algorithms are presented. Applications of the present synthesis approach to the design of associative memories realized by means of other feedback neural network models are studied. To demonstrate the applicability of the present results and to compare the present synthesis approach with existing design methods, specific examples are considered.
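The core idea summarized in the abstract is that requiring each prototype pattern to be a stable state of a feedback network yields a set of linear inequalities in the rows of the connection matrix, which can then be satisfied by perceptron-style updates. The following is a minimal sketch of that idea only, assuming bipolar prototype patterns and a Hopfield-style update x <- sign(Wx + b); the function name synthesize_weights and all parameter values (margin, learning rate, epoch limit) are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def synthesize_weights(patterns, margin=1.0, lr=0.1, max_epochs=1000):
    """Perceptron-style synthesis of a connection matrix W and bias b so that
    each bipolar prototype pattern is a fixed point of x <- sign(W x + b).

    patterns: (m, n) array of +/-1 prototype vectors (assumed bipolar).
    Hypothetical illustration of the abstract's linear-inequality formulation.
    """
    m, n = patterns.shape
    W = np.zeros((n, n))
    b = np.zeros(n)
    for _ in range(max_epochs):
        violations = 0
        for x in patterns:
            # Stability of x requires, for every neuron i:
            #   x[i] * (W[i] @ x + b[i]) > 0   -- a linear inequality in row i of W.
            s = W @ x + b
            for i in range(n):
                if x[i] * s[i] <= margin:       # inequality violated (with a margin)
                    W[i] += lr * x[i] * x       # perceptron update on row i of W
                    b[i] += lr * x[i]
                    violations += 1
        if violations == 0:
            return W, b
    raise RuntimeError("perceptron training did not converge within max_epochs")

# Usage: store two 4-bit bipolar prototypes and check they are fixed points.
protos = np.array([[1, -1, 1, -1],
                   [1, 1, -1, -1]], dtype=float)
W, b = synthesize_weights(protos)
for x in protos:
    assert np.array_equal(np.sign(W @ x + b), x)
```

Note that this sketch places no structural constraints on W; the paper's treatment of zero-diagonal, sparsity, and symmetry constraints would restrict which entries the update is allowed to modify.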