We demonstrate that the problem of approximately interpolating a target function by a neural network is computationally intractable. In particular, the interpolation training problem for a neural network with two monotone Lipschitzian sigmoidal internal activation functions and one linear output node is shown to be NP-hard, and NP-complete if the internal nodes are in addition piecewise ratios of polynomials. This partially answers a question of Blum and Rivest concerning the NP-completeness of training a logistic sigmoidal 3-node network. An extension of the result is then given for networks with n monotone sigmoidal internal nodes and one convex output node. This indicates that many multivariate nonlinear regression problems may be computationally infeasible.