PRUNING BACKPROPAGATION NEURAL NETWORKS USING MODERN STOCHASTIC OPTIMIZATION TECHNIQUES

Citation
S.W. Stepniewski and A.J. Keane, PRUNING BACKPROPAGATION NEURAL NETWORKS USING MODERN STOCHASTIC OPTIMIZATION TECHNIQUES, NEURAL COMPUTING & APPLICATIONS, 5(2), 1997, pp. 76-98
Citations number
27
Categorie Soggetti
Computer Sciences, Special Topics","Computer Science Artificial Intelligence
ISSN journal
09410643
Volume
5
Issue
2
Year of publication
1997
Pages
76 - 98
Database
ISI
SICI code
0941-0643(1997)5:2<76:PBNNUM>2.0.ZU;2-O
Abstract
Approaches combining genetic algorithms and neural networks have received a great deal of attention in recent years. As a result, much work has been reported in two major areas of neural network design: training and topology optimisation. This paper focuses on the key issues associated with the problem of pruning a multi-layer perceptron using genetic algorithms and simulated annealing. The study presented considers a number of aspects associated with network training that may alter the behaviour of a stochastic topology optimiser. Enhancements are discussed that can improve topology searches. Simulation results for the two stochastic optimisation methods applied to non-linear system identification are presented and compared with a simple random search.
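To make the pruning idea concrete, below is a minimal illustrative sketch, not the authors' implementation: a genetic algorithm searches over binary connection masks of a small two-layer perceptron on a toy non-linear identification task, with a fitness that trades fit error against the number of surviving connections. All names, the fixed pre-trained weights, the penalty coefficient, and the GA settings are assumptions made for illustration only.

```python
# Hypothetical sketch of GA-based connection pruning (not the paper's method).
import numpy as np

rng = np.random.default_rng(0)

# Toy non-linear identification data: y = sin(x) + noise (assumed example task)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X) + 0.05 * rng.normal(size=X.shape)

N_HIDDEN = 8
W1 = rng.normal(size=(1, N_HIDDEN))   # input -> hidden weights (held fixed here)
b1 = rng.normal(scale=0.5, size=N_HIDDEN)
W2 = rng.normal(size=(N_HIDDEN, 1))   # hidden -> output weights (held fixed here)
N_GENES = W1.size + W2.size           # one gene per prunable connection

def forward(mask):
    """Evaluate the network with connections switched off where mask == 0."""
    m1 = mask[:W1.size].reshape(W1.shape)
    m2 = mask[W1.size:].reshape(W2.shape)
    h = np.tanh(X @ (W1 * m1) + b1)
    return h @ (W2 * m2)

def fitness(mask, penalty=0.005):
    """Mean squared error plus a penalty per surviving connection."""
    err = np.mean((forward(mask) - y) ** 2)
    return err + penalty * mask.sum()

# Simple GA loop: tournament selection, uniform crossover, bit-flip mutation
POP, GENS = 30, 40
pop = rng.integers(0, 2, size=(POP, N_GENES))
for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    new_pop = [pop[scores.argmin()].copy()]       # elitism: keep the best mask
    while len(new_pop) < POP:
        i, j = rng.integers(0, POP, size=2)
        parent_a = pop[i] if scores[i] < scores[j] else pop[j]
        i, j = rng.integers(0, POP, size=2)
        parent_b = pop[i] if scores[i] < scores[j] else pop[j]
        cross = rng.integers(0, 2, size=N_GENES).astype(bool)
        child = np.where(cross, parent_a, parent_b)
        flip = rng.random(N_GENES) < 0.02
        child = np.where(flip, 1 - child, child)
        new_pop.append(child)
    pop = np.array(new_pop)

best = pop[np.array([fitness(ind) for ind in pop]).argmin()]
print(f"active connections: {int(best.sum())} / {N_GENES}")
```

A simulated annealing variant of the same search would mutate a single mask and accept worse fitness values with a temperature-dependent probability; the simple random search used as the paper's baseline would simply draw masks at random and keep the best.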