ENERGY FUNCTIONS FOR MINIMIZING MISCLASSIFICATION ERROR WITH MINIMUM-COMPLEXITY NETWORKS

Authors
Citation
B.A. Telfer and H.H. Szu, "Energy Functions for Minimizing Misclassification Error with Minimum-Complexity Networks", Neural Networks, 7(5), 1994, pp. 809-818
Citations number
23
Subject Categories
Mathematical Methods, Biology & Medicine; Computer Sciences, Special Topics; Computer Science, Artificial Intelligence; Neurosciences; Physics, Applied
Journal title
Neural Networks
ISSN journal
08936080
Volume
7
Issue
5
Year of publication
1994
Pages
809 - 818
Database
ISI
SICI code
0893-6080(1994)7:5<809:EFFMME>2.0.ZU;2-Z
Abstract
For automatic target recognition, a neural network is desired that minimizes the number of misclassifications with the minimum network complexity. Minimizing network complexity is important both for improving generalization and for simplifying implementation. The least mean squares (LMS) energy function used in standard back propagation does not always produce such a network. Therefore, two minimum misclassification error (MME) energy functions are advanced to achieve this. Examples are given in which LMS requires five times as many hidden units in a multilayer perceptron to achieve test-set classification accuracy similar to that achieved with the MME functions. Examples are also given to provide insight into the nature of the LMS performance, namely that LMS approximates the a posteriori probabilities, and class boundaries emerge indirectly from this process. The examples further show that the MME functions tend to find local minima less often than LMS does for the same number of hidden units. This is believed to be due to the difference in network complexity needed to accurately approximate a posteriori probabilities versus class boundaries.
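The contrast between the two kinds of energy can be sketched numerically. Below, `lms_energy` is the standard sum-of-squared-errors used in back propagation, while `mme_energy` is a sigmoid-smoothed misclassification count: it penalizes a sample only insofar as the output for its true class fails to exceed the best competing output. This particular smoothed form (and the steepness parameter `beta`) is an illustrative stand-in, not necessarily either of the two MME functions advanced in the paper.

```python
import numpy as np

def lms_energy(outputs, targets):
    # Least-mean-squares energy: half the sum of squared output errors.
    return 0.5 * np.sum((outputs - targets) ** 2)

def mme_energy(outputs, labels, beta=10.0):
    # Smoothed misclassification count. For each sample, compare the
    # output for the true class against the largest competing output;
    # a steep sigmoid of that margin approximates the 0/1 error indicator.
    n = outputs.shape[0]
    true_out = outputs[np.arange(n), labels]
    masked = outputs.copy()
    masked[np.arange(n), labels] = -np.inf   # exclude the true class
    best_other = masked.max(axis=1)
    margin = true_out - best_other           # > 0 means correctly classified
    return np.sum(1.0 / (1.0 + np.exp(beta * margin)))

# Illustrative two-class outputs; all three samples belong to class 0,
# but the second is misclassified (output for class 1 is larger).
outputs = np.array([[0.9, 0.1],
                    [0.4, 0.6],
                    [0.8, 0.2]])
labels = np.array([0, 0, 0])
targets = np.eye(2)[labels]

print(lms_energy(outputs, targets))        # → 0.41
print(round(mme_energy(outputs, labels)))  # → 1 (one misclassification)
```

Note how LMS charges the correctly classified samples for deviating from the 0/1 targets, whereas the smoothed error count is dominated by the single misclassified sample; this is the sense in which an MME-style energy targets class boundaries rather than a posteriori probabilities.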