We estimate the storage capacity of a multilayer perceptron with n inputs, h_1 threshold logic units in the first hidden layer, and one output. We show that if the network can memorize 50% of all dichotomies of a randomly selected N-tuple of points of R^n with probability 1, then N ≤ 2(nh_1 + 1), while at 100% memorization N ≤ nh_1 + 1. Furthermore, if the bounds are reached, then the first hidden layer must be fully connected to the input. It is shown that such a network has memory capacity (in the sense of Cover) between nh_1 + 1 and 2(nh_1 + 1) input patterns, and, for the most efficient networks in this class, between 1 and 2 input patterns per connection. Comparing these results with recent estimates of the VC-dimension, we find that, in contrast to the single-neuron case, the VC-dimension exceeds the capacity for sufficiently large n and h_1. The results are based on the derivation of an explicit expression for the number of dichotomies which can be implemented by such a network, for a special class of N-tuples of input patterns which has a positive probability of being randomly chosen. (C) 1997 Elsevier Science Ltd.
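As a quick numerical illustration of the two bounds stated above, the sketch below (illustrative only; the function name is not from the paper) evaluates N ≤ nh_1 + 1 for 100% memorization and N ≤ 2(nh_1 + 1) for 50% memorization for a given input dimension n and first-hidden-layer size h_1:

```python
def capacity_bounds(n, h1):
    """Upper bounds on the number N of memorizable patterns for a
    multilayer perceptron with n inputs and h1 threshold logic units
    in the first hidden layer, per the abstract's bounds:
      100% memorization: N <= n*h1 + 1
      50%  memorization: N <= 2*(n*h1 + 1)
    """
    full_bound = n * h1 + 1
    half_bound = 2 * (n * h1 + 1)
    return full_bound, half_bound

# Example: n = 10 inputs, h1 = 5 hidden threshold units.
full, half = capacity_bounds(10, 5)
print(full, half)  # 51 102
```

With n·h_1 input-to-hidden connections dominating the weight count for large n, these bounds correspond to roughly 1 to 2 input patterns per connection, matching the per-connection capacity range quoted for the most efficient networks in this class.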