Multilayer feedforward networks are often used for modeling complex functional relationships between data sets. Should a measurable redundancy in the training data exist, deleting unimportant data components in the training sets can lead to smaller networks due to the reduced size of the data vectors. This reduction can be achieved by analyzing the total disturbance of the network outputs caused by perturbed inputs. The search for redundant input data components proposed in the paper is based on the concept of sensitivity in linearized models. The mappings considered are R^I --> R^K with continuous and differentiable outputs. Criteria and an algorithm for pruning inputs are formulated and illustrated with examples.
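To make the idea concrete, the sketch below illustrates one plausible reading of sensitivity-based input pruning: the network R^I --> R^K is linearized around each training point, and inputs whose Jacobian columns contribute little to the output disturbance are flagged as pruning candidates. The toy network, the mean-squared-Jacobian aggregate, and the deliberately weakened input are illustrative assumptions, not the paper's specific criteria or algorithm.

```python
# Illustrative sketch (assumed setup, not the authors' exact method):
# rank input components of a small feedforward network by the sensitivity
# of its outputs to input perturbations, using the Jacobian of the
# linearized model accumulated over a training set.
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy network: I = 5 inputs, H = 8 hidden units, K = 2 outputs,
# tanh hidden activations, linear outputs.
I, H, K = 5, 8, 2
W1 = rng.normal(size=(H, I))
b1 = rng.normal(size=H)
W2 = rng.normal(size=(K, H))
b2 = rng.normal(size=K)

# Make input component 3 nearly redundant on purpose, so the sensitivity
# measure should single it out as a pruning candidate.
W1[:, 3] *= 1e-3

def forward(x):
    """Network mapping R^I -> R^K."""
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

def jacobian(x):
    """Analytic Jacobian dy/dx (K x I) of the linearized model at x."""
    h = np.tanh(W1 @ x + b1)
    # d/dz tanh(z) = 1 - tanh(z)^2
    return W2 @ (np.diag(1.0 - h**2) @ W1)

# Training inputs over which the total output disturbance is assessed.
X = rng.normal(size=(200, I))

# Assumed aggregate sensitivity per input: mean of squared Jacobian
# entries over all K outputs and all training points.
S = np.zeros(I)
for x in X:
    J = jacobian(x)
    S += np.sum(J**2, axis=0)
S /= len(X)

ranking = np.argsort(S)  # least influential input components first
print("per-input sensitivity:", np.round(S, 4))
print("pruning candidates (least sensitive first):", ranking)
```

Under these assumptions, dropping the lowest-ranked components shortens the input vectors, which in turn allows a network with fewer input-layer weights to be trained on the reduced data.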