DIVERGENCE MEASURES BASED ON ENTROPY FAMILIES - A TOOL FOR GUIDING THE GROWTH OF NEURAL NETWORKS

Citation
H.M.A. Andree et al., DIVERGENCE MEASURES BASED ON ENTROPY FAMILIES - A TOOL FOR GUIDING THE GROWTH OF NEURAL NETWORKS, Network, 7(3), 1996, pp. 533-554
Citations number
26
Subject categories
"Mathematical Methods, Biology & Medicine","Neurosciences","Engineering, Electrical & Electronic","Computer Science, Artificial Intelligence"
Journal title
Network
ISSN journal
0954-898X
Volume
7
Issue
3
Year of publication
1996
Pages
533 - 554
Database
ISI
SICI code
0954-898X(1996)7:3<533:DMBOEF>2.0.ZU;2-B
Abstract
Divergence measures based on two entropy families are studied. One family contains the entropies of degree alpha and the second family embodies the entropies of order alpha. The latter entropies are also known as the Rényi entropies. Both types of divergence measures yield effective quality functions for guiding the growth and optimization of feedforward neural networks built of linear threshold units. These functions are of particular value in the multi-category case. Important properties of these quality functions include their convexity on the domain of optimization and their greediness to split internal representations. As a consequence of these properties, these quality functions result in compact neural networks with good generalization properties. The suitability of some divergence measures to serve as a quality function is verified by a benchmark study. The divergence measures discussed in this paper are of great importance for the field of constructive learning.
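As an illustration of the two entropy families named in the abstract, the sketch below computes the Rényi entropy of order alpha and the entropy of degree alpha (often called the Havrda-Charvát or Tsallis entropy) for a discrete distribution. These are textbook definitions under a natural-logarithm convention; the paper's own normalizations, and the divergence measures built from them, may differ.

```python
import math

def renyi_entropy(p, alpha):
    """Entropy of order alpha (Rényi): H_a(p) = log(sum_i p_i^a) / (1 - a).

    Textbook definition for alpha > 0, alpha != 1; recovers the Shannon
    entropy in the limit alpha -> 1.
    """
    assert alpha > 0 and alpha != 1
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1.0 - alpha)

def degree_entropy(p, alpha):
    """Entropy of degree alpha (Havrda-Charvát/Tsallis form):
    S_a(p) = (1 - sum_i p_i^a) / (a - 1), for alpha != 1.

    One common normalization among several; shown for illustration only.
    """
    assert alpha != 1
    return (1.0 - sum(pi ** alpha for pi in p if pi > 0)) / (alpha - 1.0)

def shannon_entropy(p):
    """Common alpha -> 1 limit of both families."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)
```

For a uniform distribution over n outcomes, the Rényi entropy equals log(n) for every order alpha, which makes a convenient sanity check.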