A perceptron whose space of interactions can interpolate between spherically constrained and binary-valued synapses is introduced and investigated as an associative neural-network memory. For maximally stable storage, and where the weight space remains connected, the critical storage capacity α_c is found to be reduced by a factor determined solely by the geometry of the weight space, and is shown to interpolate, within the replica-symmetric approximation, between α_c = 2 (in the Gardner-model limit) and α_c = 4/π. Various comparisons of the synaptic weights with those of the binary perceptron show that such differences as remain between this weight space and that of the true binary perceptron are crucial to obtaining α_c ≥ 4/π. Although these differences limit the use of such models in realizing optimal binary networks, they may yet provide worthwhile binary systems by simple weight clipping. Simulation results are presented in support of the theoretical analyses.