In neural network theory the complexity of constructing networks to approximate input-output functions is of interest. We study this in the more general context of approximating elements f of a normed space F using partial information about f. We assume that both the information about f and the size of the network are limited, as is typical in radial basis function networks. We show that the complexity can be essentially split into two independent parts: information ε-complexity and neural ε-complexity. We use a worst case setting, and integrate elements of information-based complexity and nonlinear approximation. We consider deterministic and/or randomized approximations using information possibly corrupted by noise. The results are illustrated by examples, including approximation by piecewise polynomial neural networks. © 2001 Academic Press.