With a view to finding features of the weight space of the binary perceptron that might be instructive for training binary-synapse neural networks, the maximally stable perceptron with binary-valued weights is compared with its continuous-weight counterpart for general choices of stored patterns. The fraction of synaptic weights correctly predicted by clipping the synapses of the continuous network is calculated in the thermodynamic limit and compared with simulation results for finite systems. Numerical experiments show good agreement with theory and, in addition, indicate that the binary synapses likely to be wrongly predicted by weight clipping are predominantly those that are weakest in the continuous-synapse perceptron. Although they do not rescue training time from growing exponentially with system size, our results suggest ways of significantly accelerating the search for successful, albeit possibly imperfect, neural networks with discrete-valued couplings.
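The comparison described above can be illustrated numerically at small N, where the optimal binary perceptron can be found by exhaustive search. The sketch below is our own illustration, not the paper's procedure: it assumes random ±1 patterns, a minover-style update for the continuous perceptron, and brute-force enumeration of binary weight vectors; all variable names and parameter choices are ours.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
N, P = 15, 7                                  # small N so exhaustive search is feasible
xi = rng.choice([-1.0, 1.0], size=(P, N))     # random binary patterns
sigma = rng.choice([-1.0, 1.0], size=P)       # desired outputs

# Continuous perceptron trained with a minover-style rule:
# repeatedly reinforce the pattern with the smallest aligning field,
# which drives w toward the maximally stable continuous solution.
w = xi[0] * sigma[0]
for _ in range(20_000):
    worst = np.argmin(sigma * (xi @ w))
    w = w + sigma[worst] * xi[worst] / N

# Exhaustive search for the maximally stable binary perceptron.
best_w, best_stab = None, -np.inf
for bits in itertools.product([-1.0, 1.0], repeat=N):
    b = np.asarray(bits)
    stab = np.min(sigma * (xi @ b)) / np.sqrt(N)
    if stab > best_stab:
        best_stab, best_w = stab, b

# Clip the continuous weights to +/-1 and measure the fraction of
# binary synapses that clipping predicts correctly.
clipped = np.where(w >= 0.0, 1.0, -1.0)
frac_correct = float(np.mean(clipped == best_w))
print(f"max binary stability: {best_stab:.3f}, "
      f"fraction predicted by clipping: {frac_correct:.3f}")
```

The weights on which `clipped` and `best_w` disagree can then be inspected against `np.abs(w)` to check the observation that mispredicted synapses tend to be the weakest continuous ones.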