CAPACITY OF THE SINGLE-LAYER PERCEPTRON AND MINIMAL TRAJECTORY TRAINING ALGORITHMS

Authors
D. Saad
Citation
D. Saad, CAPACITY OF THE SINGLE-LAYER PERCEPTRON AND MINIMAL TRAJECTORY TRAINING ALGORITHMS, Journal of physics. A, mathematical and general, 26(15), 1993, pp. 3757-3773
Citations number
18
Subject categories
Physics
ISSN journal
0305-4470
Volume
26
Issue
15
Year of publication
1993
Pages
3757 - 3773
Database
ISI
SICI code
0305-4470(1993)26:15<3757:COTSPA>2.0.ZU;2-8
Abstract
The entire set of binary vectors to be stored using a single-layer perceptron can be divided into two groups: one for which the output neuron state consistently equals one of the input neuron states, and a second for which the output neuron state consistently negates that same input neuron. The capacity of the single-layer perceptron depends on the ratio between these two groups. This dependence is examined via statistical-mechanical methods, producing the probability of obtaining a linearly separable solution for a random selection of input-output relations at a given value of the above ratio. This probability is extremely useful for designing recurrent neural network training algorithms, which use the obtained results to select the most probable internal representations to be realized in such nets. Moreover, the distribution of the linearly separable binary functions enables us to obtain a good estimate of the total number of linearly separable binary functions for a given number of input neurons, a task considered a hard computational problem. Additional incentives for carrying out the calculation are understanding the capacity of simple nets for certain types of input-output correlations and laying the foundations for analysing some constructive training algorithms such as the tiling and upstart algorithms. All results are consistent with existing theoretical results.
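A minimal sketch of the counting task the abstract mentions: for a small number of input neurons, the number of linearly separable Boolean functions can be found by brute-force enumeration of all truth tables, using the fact that perceptron learning converges if and only if the labelled points are linearly separable. The function names and the epoch bound below are illustrative assumptions, not taken from the paper (which estimates the count analytically rather than by enumeration).

```python
from itertools import product

def linearly_separable(points, labels, max_epochs=1000):
    # Perceptron learning rule with the bias folded in as an extra
    # constant input. On linearly separable data the loop converges
    # (no updates in a full epoch); otherwise it never does, so the
    # epoch bound serves as a practical non-separability test.
    w = [0.0] * (len(points[0]) + 1)
    for _ in range(max_epochs):
        updated = False
        for x, y in zip(points, labels):
            xb = list(x) + [1.0]  # append bias component
            if y * sum(wi * xi for wi, xi in zip(w, xb)) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, xb)]
                updated = True
        if not updated:
            return True
    return False

def count_separable(n):
    # Enumerate all 2^(2^n) Boolean functions on n inputs in {-1, +1}
    # and count those realizable by a single-layer perceptron.
    inputs = list(product([-1, 1], repeat=n))
    return sum(
        linearly_separable(inputs, labels)
        for labels in product([-1, 1], repeat=len(inputs))
    )

print(count_separable(2))  # 14 of the 16 Boolean functions on 2 inputs
```

For n = 2 this recovers the classical count of 14 (all functions except XOR and XNOR); the 2^(2^n) enumeration is exactly why the problem becomes computationally hard for larger n, motivating the estimate derived in the paper.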