We present an approach to the statistical mechanics of feedforward neural networks which is based on counting realizable internal representations by utilizing convexity properties of the weight space. For a toy model, our method yields storage capacities based on an annealed approximation, which are in close agreement with one-step replica-symmetry-breaking results obtained from a standard approach. For a single-layer perceptron, a combinatorial result for the number of realizable output combinations is recovered and generalized to fixed stabilities.