The information processing abilities of a multilayer neural network with a
number of hidden units scaling as the input dimension are studied using
statistical mechanics methods. The mapping from the input layer to the hidden
units is performed by general symmetric Boolean functions, whereas the hidden
layer is connected to the output by either discrete or continuous couplings.
Introducing an overlap in the space of Boolean functions as order parameter,
the storage capacity is found to scale with the logarithm of the number of
implementable Boolean functions. The generalization behavior is smooth for
continuous couplings and shows a discontinuous transition to perfect
generalization for discrete ones.
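As a minimal illustration of the architecture described above (a sketch under assumed conventions, not the authors' construction): a symmetric Boolean function of ±1 inputs depends only on the number of +1 entries, so each hidden unit can be represented by a truth table over that count, and the output is a perceptron on the hidden layer. All sizes and the random truth tables here are hypothetical choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 12  # input dimension (assumed value, for illustration only)
K = 12  # number of hidden units, scaling as the input dimension

# Each hidden unit implements a symmetric Boolean function: its output
# depends only on the number of +1 entries in the input, i.e. on the sum.
# Here each unit's truth table over the N+1 possible counts is random ±1.
tables = rng.choice([-1, 1], size=(K, N + 1))

# Hidden-to-output couplings; continuous (Gaussian) in this sketch.
# The discrete case of the abstract would draw them from ±1 instead.
J = rng.standard_normal(K)

def forward(x):
    """Map a ±1 input vector to a ±1 output through the hidden layer."""
    n_plus = int(np.sum(x == 1))      # symmetric: only the count matters
    hidden = tables[:, n_plus]        # each unit reads its truth table
    return int(np.sign(J @ hidden))   # output perceptron on hidden layer

x = rng.choice([-1, 1], size=N)
out = forward(x)

# Symmetry check: permuting the input leaves every hidden unit, and
# hence the output, unchanged.
assert forward(rng.permutation(x)) == out
```

The permutation invariance in the final assertion is exactly what makes the Boolean functions "symmetric" in the sense used above: the hidden layer sees only how many inputs are active, not which ones.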