Sensitivity analysis of multilayer perceptron to input and weight perturbations

Citation
X.Q. Zeng and D.S. Yeung, Sensitivity analysis of multilayer perceptron to input and weight perturbations, IEEE Transactions on Neural Networks, 12(6), 2001, pp. 1358-1366
Citation count
14
Subject Categories
AI Robotics and Automatic Control
Journal title
IEEE TRANSACTIONS ON NEURAL NETWORKS
ISSN journal
1045-9227
Volume
12
Issue
6
Year of publication
2001
Pages
1358 - 1366
Database
ISI
SICI code
1045-9227(200111)12:6<1358:SAOMPT>2.0.ZU;2-D
Abstract
An important issue in the design and implementation of a neural network is the sensitivity of its output to input and weight perturbations. In this paper, we discuss the sensitivity of the most popular and general feedforward neural network, the multilayer perceptron (MLP). The sensitivity is defined as the mathematical expectation of the output errors of the MLP due to input and weight perturbations, taken with respect to all input and weight values in a given continuous interval. The sensitivity of a single neuron is discussed first, and an analytical expression that is a function of the absolute values of the input and weight perturbations is approximately derived. An algorithm is then given to compute the sensitivity of the entire MLP. As intuitively expected, the sensitivity increases with the input and weight perturbations, but the increase has an upper bound that is determined by the structural configuration of the MLP, namely the number of neurons per layer and the number of layers. There exists an optimal value for the number of neurons in a layer, which yields the highest sensitivity value. The effect caused by the number of layers is quite unexpected: the sensitivity of a neural network may decrease at first and then remain almost constant as the number of layers increases.
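The sensitivity defined in the abstract, the expected output error under input and weight perturbations averaged over all input and weight values in a continuous interval, can be illustrated with a plain Monte Carlo estimate. The sketch below is not the paper's analytical derivation or its algorithm; the sigmoid activation, the uniform [-1, 1] sampling interval, and all function and parameter names are illustrative assumptions.

```python
import numpy as np

def mlp_forward(x, weights):
    """Forward pass through an MLP with logistic (sigmoid) activations."""
    a = x
    for W in weights:
        a = 1.0 / (1.0 + np.exp(-(W @ a)))
    return a

def estimate_sensitivity(layer_sizes, delta_x, delta_w,
                         n_samples=20000, seed=0):
    """Monte Carlo estimate of E[|y(x+dx, w+dw) - y(x, w)|]:
    the expected absolute output deviation of an MLP under input
    perturbations of magnitude delta_x and weight perturbations of
    magnitude delta_w, with inputs and weights drawn uniformly from
    [-1, 1] (an assumed sampling interval, for illustration only).
    """
    rng = np.random.default_rng(seed)
    n_in = layer_sizes[0]
    # Weight matrix shapes for consecutive layer pairs.
    shapes = [(layer_sizes[i + 1], layer_sizes[i])
              for i in range(len(layer_sizes) - 1)]
    total = 0.0
    for _ in range(n_samples):
        x = rng.uniform(-1.0, 1.0, n_in)
        ws = [rng.uniform(-1.0, 1.0, s) for s in shapes]
        dx = rng.uniform(-delta_x, delta_x, n_in)
        dws = [rng.uniform(-delta_w, delta_w, s) for s in shapes]
        y0 = mlp_forward(x, ws)
        y1 = mlp_forward(x + dx, [W + dW for W, dW in zip(ws, dws)])
        total += np.mean(np.abs(y1 - y0))
    return total / n_samples

if __name__ == "__main__":
    # Sensitivity of a hypothetical 4-8-1 MLP for growing perturbations.
    for delta in (0.01, 0.1, 0.5, 1.0):
        s = estimate_sensitivity([4, 8, 1], delta_x=delta, delta_w=delta,
                                 n_samples=5000)
        print(f"delta = {delta:.2f}  ->  estimated sensitivity = {s:.4f}")
```

Running the example reproduces the qualitative behavior the abstract describes: the estimated sensitivity grows with the perturbation magnitude but stays bounded, since the sigmoid output is confined to (0, 1) regardless of how large the perturbations become.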