Sensitivity analysis for selective learning by feedforward neural networks

Authors
Citation
A.P. Engelbrecht, Sensitivity analysis for selective learning by feedforward neural networks, FUNDAM INF, 46(3), 2001, pp. 219-252
Citations number
62
Subject Categories
Computer Science & Engineering
Journal title
FUNDAMENTA INFORMATICAE
ISSN journal
0169-2968
Volume
46
Issue
3
Year of publication
2001
Pages
219-252
Database
ISI
SICI code
0169-2968(200105)46:3<219:SAFSLB>2.0.ZU;2-I
Abstract
Research on improving the performance of feedforward neural networks has concentrated mostly on the optimal setting of initial weights and learning parameters, sophisticated optimization techniques, architecture optimization, and adaptive activation functions. An alternative approach is presented in this paper where the neural network dynamically selects training patterns from a candidate training set during training, using the network's currently attained knowledge about the target concept. Sensitivity analysis of the neural network output with respect to small input perturbations is used to quantify the informativeness of candidate patterns. Only the most informative patterns, which are those patterns closest to decision boundaries, are selected for training. Experimental results show a significant reduction in the training set size, without negatively influencing generalization performance and convergence characteristics. This approach to selective learning is then compared to an alternative where informativeness is measured as the magnitude of the prediction error.
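The selection criterion described in the abstract can be sketched in code. The following is a minimal illustrative example, not the paper's implementation: it assumes a toy 2-4-1 sigmoid network with random weights, computes the output-to-input sensitivity (Jacobian) for each candidate pattern via the chain rule, and keeps the patterns with the largest sensitivity norm, i.e. those nearest a decision boundary. The network size, the Euclidean norm, and the selection fraction are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-4-1 feedforward network with sigmoid activations (illustrative weights).
W1 = rng.normal(size=(2, 4))
b1 = rng.normal(size=4)
W2 = rng.normal(size=(4, 1))
b2 = rng.normal(size=1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """Network output for a single input pattern x of shape (2,)."""
    h = sigmoid(x @ W1 + b1)
    return sigmoid(h @ W2 + b2)

def sensitivity(x):
    """Analytic gradient do/dx of the network output w.r.t. the input.

    Chain rule for the 2-4-1 sigmoid network:
      do/dx_i = o(1-o) * sum_j W2[j] * h_j(1-h_j) * W1[i,j]
    """
    h = sigmoid(x @ W1 + b1)
    o = sigmoid(h @ W2 + b2)
    return (o * (1 - o)) * ((W2[:, 0] * h * (1 - h)) @ W1.T)

# Candidate training patterns; score each by its sensitivity norm and
# select the most informative fraction (closest to a decision boundary).
candidates = rng.uniform(-1.0, 1.0, size=(20, 2))
scores = np.array([np.linalg.norm(sensitivity(x)) for x in candidates])
k = 5  # assumed selection size for the sketch
selected = candidates[np.argsort(scores)[-k:]]
```

In the paper's scheme this scoring would be repeated during training, so the selected subset tracks the network's current decision boundaries rather than being fixed up front.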