A NEW SYNTHESIS APPROACH FOR FEEDBACK NEURAL NETWORKS BASED ON THE PERCEPTRON TRAINING ALGORITHM

Authors
D.R. Liu, Z.J. Lu
Citation
D.R. Liu and Z.J. Lu, A NEW SYNTHESIS APPROACH FOR FEEDBACK NEURAL NETWORKS BASED ON THE PERCEPTRON TRAINING ALGORITHM, IEEE Transactions on Neural Networks, 8(6), 1997, pp. 1468-1482
Citations number
33
Subject Categories
Computer Application, Chemistry & Engineering; Engineering, Electrical & Electronic; Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods
ISSN journal
1045-9227
Volume
8
Issue
6
Year of publication
1997
Pages
1468 - 1482
Database
ISI
SICI code
1045-9227(1997)8:6<1468:ANSAFF>2.0.ZU;2-2
Abstract
In this paper, a new synthesis approach is developed for associative memories based on the perceptron training algorithm. The design (synthesis) problem of feedback neural networks for associative memories is formulated as a set of linear inequalities such that the use of perceptron training is evident. The perceptron training in the synthesis algorithms is guaranteed to converge for the design of neural networks without any constraints on the connection matrix. For neural networks with constraints on the diagonal elements of the connection matrix, results concerning the properties of such networks and concerning the existence of such a network design are established. For neural networks with sparsity and/or symmetry constraints on the connection matrix, design algorithms are presented. Applications of the present synthesis approach to the design of associative memories realized by means of other feedback neural network models are studied. To demonstrate the applicability of the present results and to compare the present synthesis approach with existing design methods, specific examples are considered.
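
The core idea summarized above is that storing each memory pattern reduces, row by row of the connection matrix, to a set of linear inequalities that ordinary perceptron training can satisfy. The following Python sketch is only a rough illustration of that formulation, not the paper's exact algorithm: it ignores the diagonal, sparsity, and symmetry constraints treated in the paper, assumes a simple sign-activation feedback model (the paper also covers other feedback models), and all function and parameter names are our own.

import numpy as np

def perceptron_row(patterns, targets, margin=1.0, lr=0.1, max_epochs=1000):
    """Perceptron training for one row of the connection matrix.

    Finds (w, b) such that targets[i] * (w @ patterns[i] + b) >= margin
    for every stored pattern, i.e., solves one group of the linear
    inequalities in the synthesis formulation (illustrative sketch).
    """
    m, n = patterns.shape
    w = np.zeros(n)
    b = 0.0
    for _ in range(max_epochs):
        updated = False
        for x, t in zip(patterns, targets):
            if t * (w @ x + b) < margin:      # inequality still violated
                w += lr * t * x               # standard perceptron update
                b += lr * t
                updated = True
        if not updated:                       # all inequalities satisfied
            return w, b
    raise RuntimeError("did not converge; patterns may not be storable")

def synthesize(patterns):
    """Build a connection matrix W and bias vector b, row by row, so that
    each bipolar pattern is a fixed point of x <- sign(W x + b)."""
    m, n = patterns.shape
    W = np.zeros((n, n))
    b = np.zeros(n)
    for j in range(n):
        W[j], b[j] = perceptron_row(patterns, patterns[:, j])
    return W, b

# Usage: store two 4-bit bipolar memories and verify they are fixed points.
X = np.array([[1, -1, 1, -1],
              [1, 1, -1, -1]], dtype=float)
W, b = synthesize(X)
assert np.array_equal(np.sign(X @ W.T + b), X)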