There are two archetypal ways to control the complexity of a flexible regressor: subset selection and ridge regression. In neural-network jargon, they are known, respectively, as pruning and weight decay. These techniques may also be adapted to estimate which features of the input space are relevant for predicting the output variables. Relevance is given by a binary indicator for subset selection, and by a continuous rating for ridge regression. This paper shows how to obtain such a rating for a multilayer perceptron trained with noise (or jitter). Noise injection (NI) is modified so as to heavily penalize irrelevant features. The proposed algorithm is attractive in that it requires the tuning of a single parameter. This parameter controls the complexity of the model (effective number of parameters) together with the rating of feature relevances (effective input-space dimension). Bounds on the effective number of parameters indicate that the stability of this adaptive scheme is enforced by the constraints applied to the admissible set of relevance indices. The good properties of the algorithm are confirmed by satisfactory experimental results on simulated data sets.
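To make the idea concrete, the sketch below illustrates anisotropic noise injection on a one-hidden-layer perceptron in NumPy. It is not the paper's exact procedure: the toy data, the network size, and the heuristic rule used to adapt the per-input noise scales are assumptions for illustration only. The single tuning parameter is a global noise level sigma0; the per-input scales sigma_i are constrained so their overall magnitude stays fixed, and the resulting 1/sigma_i^2 play the role of continuous relevance ratings.

```python
# Illustrative sketch only: anisotropic noise injection for input relevance rating.
# The relevance-update rule below is a heuristic stand-in, not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (assumption): only the first 2 of 5 inputs matter.
n, d = 200, 5
X = rng.normal(size=(n, d))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=n)

h = 10                          # hidden units
W1 = rng.normal(scale=0.5, size=(d, h))
b1 = np.zeros(h)
W2 = rng.normal(scale=0.5, size=h)
b2 = 0.0

sigma0 = 0.3                    # single tuning parameter: global noise level
sigma = np.full(d, sigma0)      # per-input noise scales (large sigma_i = low relevance)
lr = 0.01

def forward(Xn):
    H = np.tanh(Xn @ W1 + b1)
    return H, H @ W2 + b2

for epoch in range(500):
    # Inject anisotropic noise (jitter) on the inputs before each pass.
    Xn = X + rng.normal(size=X.shape) * sigma
    H, pred = forward(Xn)
    err = pred - y

    # Plain backpropagation for the mean squared error.
    gW2 = H.T @ err / n
    gb2 = err.mean()
    dH = np.outer(err, W2) * (1 - H**2)
    gW1 = Xn.T @ dH / n
    gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

    # Heuristic relevance update (assumption): inputs whose first-layer weights
    # carry more signal receive less noise; scales are re-normalised so that
    # sum_i sigma_i^2 = d * sigma0^2 is kept fixed by the single parameter sigma0.
    sensitivity = np.sqrt((W1**2).sum(axis=1)) + 1e-8
    sigma = 1.0 / sensitivity
    sigma *= sigma0 * np.sqrt(d / (sigma**2).sum())

relevance = 1.0 / sigma**2      # continuous relevance rating per input
print(np.round(relevance / relevance.sum(), 3))
```

Under this constraint, raising the noise on one input forces it to be lowered on the others, which is the mechanism the abstract refers to when it states that constraints on the admissible set of relevance indices enforce the stability of the adaptive scheme.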