LEARNING IN THE HYPERCUBE - A STEPPING STONE TO THE BINARY PERCEPTRON

Citation
M. Bouten et al., LEARNING IN THE HYPERCUBE - A STEPPING STONE TO THE BINARY PERCEPTRON, Physical review. E, Statistical physics, plasmas, fluids, and related interdisciplinary topics, 58(2), 1998, pp. 2378-2385
Citations number
19
Categorie Soggetti
Physics, Mathematical; Physics, Fluids & Plasmas
ISSN journal
1063-651X
Volume
58
Issue
2
Year of publication
1998
Part
B
Pages
2378 - 2385
Database
ISI
SICI code
1063-651X(1998)58:2<2378:LITH-A>2.0.ZU;2-L
Abstract
The learning problem for storing random patterns in a perceptron with binary weights can be facilitated by pretraining an appropriate precursor network with continuous weights. Unlike previous studies which compare the performance of different continuous-weight perceptrons on the hypersphere (spherical constraint), we also consider weight vectors constrained to the volume of the hypercube (cubical constraint). We compare the performance of the maximally stable networks on the hypersphere and in the hypercube, and show that the latter is superior for predicting the weights of the maximally stable binary perceptron. We further determine an upper bound for the fraction of binary weights that any precursor is able to predict correctly, and introduce a precursor in the hypercube that closely approaches this upper bound. We finally demonstrate the value of this hypercube precursor by carrying out simulations for a perceptron with up to 100 weights.
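
The abstract describes pretraining a continuous-weight precursor whose weight vector is confined to the hypercube, and then reading off the binary weights from the signs of the precursor's components. The following minimal Python sketch illustrates that general idea only; the update rule, learning rate, iteration count, and storage ratio used here are assumptions chosen for illustration and are not the precursor construction defined in the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    N = 100            # number of weights (the paper simulates up to N = 100)
    alpha = 0.5        # assumed storage ratio; P = alpha * N random patterns
    P = int(alpha * N)

    # Random +/-1 patterns and target outputs to be stored.
    xi = rng.choice([-1.0, 1.0], size=(P, N))
    sigma = rng.choice([-1.0, 1.0], size=P)

    # Precursor training (illustrative): repeatedly update on the pattern with
    # the smallest stability, then clip every weight into [-1, 1] so the weight
    # vector stays inside the hypercube (cubical constraint).
    J = rng.uniform(-0.1, 0.1, size=N)
    for _ in range(10000):
        stabilities = sigma * (xi @ J) / np.sqrt(N)
        worst = np.argmin(stabilities)          # least stable pattern
        J += (1.0 / N) * sigma[worst] * xi[worst]
        J = np.clip(J, -1.0, 1.0)               # cubical constraint

    # Predicted binary weights: the signs of the hypercube precursor's weights.
    J_binary = np.sign(J)
    J_binary[J_binary == 0] = 1.0

    # Fraction of the random patterns the resulting binary perceptron stores.
    correct = np.mean(sigma * (xi @ J_binary) > 0)
    print(f"binary perceptron stores {correct:.2%} of the patterns")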