Devices such as neural networks typically approximate the elements of some function space $X$ by elements of a nontrivial finite union $M$ of finite-dimensional spaces. It is shown that if $X = L_p(\Omega)$ ($1 < p < \infty$ and $\Omega \subset \mathbb{R}^d$), then for any positive constant $\Gamma$ and any continuous function $\varphi$ from $X$ to $M$, $\|f - \varphi(f)\| > \|f - M\| + \Gamma$ for some $f$ in $X$. Thus, no continuous finite neural network approximation can be within any positive constant of a best approximation in the $L_p$-norm.
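The result stated in the abstract can be written out formally as follows; this is a sketch of the statement only, where $\|f - M\|$ is read as the distance $\operatorname{dist}(f, M) = \inf_{g \in M} \|f - g\|_p$ (a standard reading, assumed here rather than spelled out in the abstract):

```latex
% Formal restatement of the abstract's claim (assumes the amsthm theorem
% environment; the distance interpretation of ||f - M|| is an assumption).
\begin{theorem}
Let $X = L_p(\Omega)$ with $1 < p < \infty$ and $\Omega \subset \mathbb{R}^d$,
and let $M \subset X$ be a nontrivial finite union of finite-dimensional
subspaces. Then for every continuous map $\varphi \colon X \to M$ and every
constant $\Gamma > 0$ there exists $f \in X$ such that
\[
  \| f - \varphi(f) \|_p \;>\; \operatorname{dist}(f, M) + \Gamma,
  \qquad \text{where } \operatorname{dist}(f, M) = \inf_{g \in M} \| f - g \|_p .
\]
\end{theorem}
```

In particular, since $\Gamma$ is arbitrary, the error of any continuous approximation scheme exceeds the best-approximation error not just by a ratio but by an arbitrarily large additive gap on some input.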