NETWORK GENERALIZATION DIFFERENCES QUANTIFIED

Authors
D. Partridge
Citation
D. Partridge, NETWORK GENERALIZATION DIFFERENCES QUANTIFIED, Neural networks, 9(2), 1996, pp. 263-271
Citations number
10
Categorie Soggetti
Mathematical Methods, Biology & Medicine; Computer Sciences, Special Topics; Computer Science, Artificial Intelligence; Neurosciences; Physics, Applied
Journal title
Neural Networks
ISSN journal
08936080
Volume
9
Issue
2
Year of publication
1996
Pages
263 - 271
Database
ISI
SICI code
0893-6080(1996)9:2<263:NGDQ>2.0.ZU;2-X
Abstract
It has long been observed, and frequently noted, by connectionists that small changes in initial conditions, prior to training, can result in networks that generalize very differently. We have performed a systematic study of this phenomenon, using a number of different statistical measures of generalization differences. From these we derive a formal definition of Generalization Diversity. We quantify the relative impacts on generalization of the major parameters used in network initialization, as well as extend the formal framework to also encompass the differences in generalization difference from one parameter to another. We reveal, for example, the relative effects of random initialization of the link weights and variation of the number of hidden units, and how similar these two resultant effects are. Finally, examples are presented of how the proposed generalization diversity measure may be exploited in order to improve the performance of neural-net systems. We show how several of these measures can be used to engineer reliability improvements in neural-net systems.
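The abstract describes quantifying how differently trained networks generalize. As a minimal sketch of one such statistic (an assumption, not necessarily the paper's exact measure), diversity among a set of trained networks can be taken as the mean pairwise disagreement rate of their predictions on a shared test set; the networks below are stand-ins represented only by their prediction vectors.

```python
# Hedged sketch: generalization diversity as mean pairwise disagreement
# on a common test set. The statistic is a plausible stand-in; the
# paper's own measures may be defined differently.
from itertools import combinations


def pairwise_disagreement(preds_a, preds_b):
    """Fraction of test inputs on which two networks' predictions differ."""
    assert len(preds_a) == len(preds_b)
    return sum(a != b for a, b in zip(preds_a, preds_b)) / len(preds_a)


def generalization_diversity(all_preds):
    """Mean disagreement over all unordered pairs of networks."""
    pairs = list(combinations(all_preds, 2))
    return sum(pairwise_disagreement(a, b) for a, b in pairs) / len(pairs)


# Example: three networks' binary predictions on a five-item test set,
# e.g. trained from different random weight initializations.
nets = [
    [0, 1, 1, 0, 1],
    [0, 1, 0, 0, 1],
    [1, 1, 1, 0, 0],
]
print(generalization_diversity(nets))  # prints 0.4
```

A diversity near 0 would indicate the networks fail and succeed on essentially the same inputs, while a higher value suggests their errors are spread over different inputs, which is the property a majority-vote ensemble can exploit for the reliability improvements the abstract mentions.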