M. Bouten et al., "Learning in the hypercube: A stepping stone to the binary perceptron," Physical Review E, 58(2), 1998, pp. 2378-2385
The learning problem for storing random patterns in a perceptron with binary weights can be facilitated by pretraining an appropriate precursor network with continuous weights. Unlike previous studies, which compare the performance of different continuous-weight perceptrons on the hypersphere (spherical constraint), we also consider weight vectors constrained to the volume of the hypercube (cubical constraint). We compare the performance of the maximally stable networks on the hypersphere and in the hypercube, and show that the latter is superior for predicting the weights of the maximally stable binary perceptron. We further determine an upper bound for the fraction of binary weights that any precursor is able to predict correctly, and introduce a precursor in the hypercube that closely approaches this upper bound. We finally demonstrate the value of this hypercube precursor by carrying out simulations for a perceptron with up to 100 weights.
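The precursor idea described above can be sketched in a minimal form: train a continuous-weight perceptron on random binary patterns, then predict the binary weights by taking the sign of (clipping) the trained continuous weights. This is a hedged illustration only; the learning rule, pattern count, and clipping step below are assumptions for demonstration, not the paper's maximal-stability algorithm or its hypercube-constrained precursor.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 30                                # N weights, P stored patterns
X = rng.choice([-1.0, 1.0], size=(P, N))      # random binary input patterns
y = rng.choice([-1.0, 1.0], size=P)           # random binary target outputs

# Continuous-weight precursor trained with the classic perceptron rule
# (an assumption here; the paper uses maximally stable networks instead).
w = np.zeros(N)
for _ in range(1000):
    errors = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:                # pattern misclassified
            w += yi * xi                      # perceptron update
            errors += 1
    if errors == 0:                           # all patterns stored
        break

# Predict binary weights by clipping the continuous precursor weights.
w_bin = np.where(w >= 0, 1.0, -1.0)

# Fraction of the stored patterns that the clipped weights still classify.
stored = np.mean(y * (X @ w_bin) > 0)
print(f"fraction stored by clipped weights: {stored:.2f}")
```

At this loading (P/N = 0.3, well below the continuous perceptron's capacity of 2), the continuous precursor converges easily, and the interesting question, which the paper addresses analytically, is how many of the clipped weights agree with those of the maximally stable binary perceptron.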