This paper studies lower bounds on the numbers of neurons and examples necessary to program a given task into feedforward neural networks. We introduce the notion of information complexity of a network to complement that of neural complexity. Neural complexity deals with lower bounds for the neural resources (numbers of neurons) needed by a network to perform a given task within a given tolerance. Information complexity measures lower bounds for the information (i.e., number of examples) needed about the desired input-output function. We study the interaction of the two complexities, and hence lower bounds for the complexity of building and then programming feedforward nets for given tasks. We show something unexpected a priori: the interaction of the two can be simply bounded, so that they can be studied essentially independently. We construct radial basis function (RBF) algorithms of order n^3 that are information-optimal, and give example applications. (C) 2000 Elsevier Science Ltd. All rights reserved.