IMPROVED BINARY CLASSIFICATION PERFORMANCE USING AN INFORMATION-THEORETIC CRITERION

Citation
P. Burrascano and D. Pirollo, IMPROVED BINARY CLASSIFICATION PERFORMANCE USING AN INFORMATION-THEORETIC CRITERION, Neurocomputing, 13(2-4), 1996, pp. 375-383
Citations number
16
Subject Categories
Computer Sciences, Special Topics; Computer Science, Artificial Intelligence; Neurosciences
Journal title
Neurocomputing
ISSN journal
0925-2312
Volume
13
Issue
2-4
Year of publication
1996
Pages
375 - 383
Database
ISI
SICI code
0925-2312(1996)13:2-4<375:IBCPUA>2.0.ZU;2-C
Abstract
Feedforward neural networks trained to solve classification problems define an approximation of the conditional probabilities P(C_i|x) if the output units correspond to the categories C_i. The present paper shows that if a least-mean-squared-error cost function is minimised during the training phase, the resulting approximation of the P(C_i|x) is poor in the ranges of the input variable x where the conditional probabilities take on very low values. The use of the Kullback-Leibler distance measure is proposed to overcome this limitation; a cost function derived from this information-theoretic measure is defined, and a computationally light training procedure is derived for the case of binary classification problems. The effectiveness of the proposed procedure is verified by means of comparative experiments.
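
As a rough illustration of the abstract's central point (not the authors' actual cost function or training procedure; mse_cost and kl_cost are illustrative names), the following Python sketch compares the squared-error and Kullback-Leibler penalties for two estimation errors of equal absolute size, one of which occurs where the true conditional probability is very low.

import numpy as np

# Minimal numerical sketch (not the paper's exact cost function or training
# procedure): it only illustrates why a Kullback-Leibler-based cost is more
# sensitive than the squared error where P(C_i|x) takes on very low values.

def mse_cost(p, q):
    """Least-mean-squared-error cost between target probability p and output q."""
    return np.mean((q - p) ** 2)

def kl_cost(p, q, eps=1e-12):
    """Kullback-Leibler distance between the binary distributions (p, 1-p)
    and (q, 1-q), averaged over samples."""
    p = np.clip(p, eps, 1.0 - eps)
    q = np.clip(q, eps, 1.0 - eps)
    return np.mean(p * np.log(p / q) + (1.0 - p) * np.log((1.0 - p) / (1.0 - q)))

# Two estimation errors of the same absolute size (0.18): one where the true
# conditional probability is very low, one where it is moderate.
print(mse_cost(0.02, 0.20))   # 0.0324
print(mse_cost(0.50, 0.68))   # 0.0324 -> the squared error cannot tell them apart
print(kl_cost(0.02, 0.20))    # ~0.153 -> larger penalty in the low-probability range
print(kl_cost(0.50, 0.68))    # ~0.069

The squared error assigns both errors the same cost, whereas the KL distance weights the error in the low-probability range roughly twice as heavily; this is the limitation of the least-mean-squared-error criterion that the proposed cost function is designed to overcome.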