The adaptive, data-driven emulation and control of mechanical systems are popular applications of artificial neural networks in engineering. However, multilayer perceptron training is an ill-posed nonlinear optimization problem. This paper explores a method of constraining network parameters so that conventional computational techniques for function approximation can be used during training. This is accomplished by forming local basis functions that provide accurate approximation and stable evaluation of the network parameters. The approach is general and does not depart from standard feedforward network architecture. By employing the concept of shift-invariant subspaces, it yields a new and more robust error condition for feedforward artificial neural networks and allows one to both characterise and control the accuracy of the local bases formed. Two refinement methods are used: (1) adding bases while altering their shape and keeping their spacing constant, and (2) adding bases while altering their shape and decreasing their spacing in a coupled fashion. Numerical examples demonstrate the usefulness of the proposed approximation of functions and their derivatives. © 1998 Elsevier Science Ltd. All rights reserved.
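The abstract does not give the construction details, but the two refinement strategies can be illustrated with a minimal sketch: a least-squares fit of an assumed target function using Gaussian local bases on [0, 1]. The basis type, widths, grids, and target function here are illustrative assumptions, not the paper's actual constructions.

```python
# Minimal sketch (not the paper's algorithm): least-squares approximation with
# Gaussian local bases, contrasting the two refinement strategies named above.
import numpy as np

def gaussian_bases(x, centers, width):
    """One Gaussian 'bump' per center, evaluated at the sample points x."""
    return np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)

def ls_fit_error(Phi, y):
    """Least-squares fit in the span of the columns of Phi; max abs error."""
    coeffs, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return np.max(np.abs(Phi @ coeffs - y))

f = lambda t: np.sin(2 * np.pi * t)          # assumed target function
x = np.linspace(0.0, 1.0, 400)
y = f(x)

for n in (4, 8, 16):
    # Strategy (1): spacing held constant; bases are added by enlarging the
    # set of shapes (several widths sharing the same coarse grid of centers).
    coarse = np.linspace(0.0, 1.0, 6)
    widths = [0.5 / k for k in range(1, n + 1)]
    Phi1 = np.hstack([gaussian_bases(x, coarse, w) for w in widths])
    # Strategy (2): spacing h shrinks as bases are added, with the shape
    # (width) coupled to h, as in shift-invariant refinement.
    centers = np.linspace(0.0, 1.0, 6 * n)
    h = centers[1] - centers[0]
    Phi2 = gaussian_bases(x, centers, 2.0 * h)
    print(f"n={n:2d}  fixed-spacing err={ls_fit_error(Phi1, y):.1e}"
          f"  coupled err={ls_fit_error(Phi2, y):.1e}")
```

In this toy setting both strategies drive the fit error down as bases are added; the paper's contribution, per the abstract, is characterising and controlling that accuracy through shift-invariant subspace theory rather than through an ad hoc fit like this one.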