Artificial neural networks consist of an interconnected collection of nonlinear devices; examples of commonly used devices include linear threshold elements, sigmoidal elements, and radial-basis elements. We employ results from harmonic analysis and the theory of rational approximation to obtain almost tight lower bounds on the size (i.e., number of elements) of neural networks. The class of neural networks to which our techniques apply is quite general; it includes any feedforward network in which each element can be piecewise approximated by a low-degree rational function.
For example, we prove that any depth-$(d+1)$ network of sigmoidal units or linear threshold elements that computes the parity function of $n$ variables must have size $\Omega(dn^{1/d-\epsilon})$, for any fixed $\epsilon > 0$. In addition, we prove that this lower bound is almost tight by showing that the parity function can be computed with $O(dn^{1/d})$ sigmoidal units or linear threshold elements in a depth-$(d+1)$ network. These almost tight bounds are the first known complexity results on the size of neural networks of depth greater than two. Our lower bound techniques yield a unified approach to the complexity analysis of various models of feedforward neural networks. Moreover, our results indicate that, in the context of computing highly oscillating symmetric Boolean functions, networks of continuous-output units such as sigmoidal elements offer no significant reduction in size compared with networks of linear threshold elements with binary outputs.
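
To make the upper-bound side concrete, the following is a minimal Python sketch of the classic depth-2 construction of parity from linear threshold elements. It is the standard textbook construction, given here only as an illustration of the $d = 1$ case, not as the paper's general depth-$(d+1)$ circuit; the names threshold and parity_depth2 are ours.

    def threshold(weights, inputs, bias):
        # Linear threshold element: fires iff the weighted sum reaches the bias.
        return 1 if sum(w * x for w, x in zip(weights, inputs)) >= bias else 0

    def parity_depth2(x):
        # Depth-2 threshold circuit for parity using n + 1 gates.
        n = len(x)
        # Layer 1: gate k fires iff at least k input bits are 1, for k = 1..n.
        layer1 = [threshold([1] * n, x, k) for k in range(1, n + 1)]
        # Layer 2: alternating +1/-1 weights make the signed sum equal 1 when
        # the number of ones is odd and 0 when it is even; threshold at 1.
        alternating = [(-1) ** k for k in range(n)]
        return threshold(alternating, layer1, 1)

    # Exhaustive check against the XOR definition of parity for small n.
    from itertools import product
    for n in range(1, 8):
        for bits in product([0, 1], repeat=n):
            assert parity_depth2(list(bits)) == sum(bits) % 2

At $d = 1$ the upper bound $O(dn^{1/d})$ is $O(n)$, which the $n + 1$ gates above achieve in a depth-2 network.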