NETWORK CONSTRAINTS AND MULTIOBJECTIVE OPTIMIZATION FOR ONE-CLASS CLASSIFICATION

Authors
Citation
M.M. Moya and D.R. Hush, NETWORK CONSTRAINTS AND MULTIOBJECTIVE OPTIMIZATION FOR ONE-CLASS CLASSIFICATION, Neural Networks, 9(3), 1996, pp. 463-474
Citations number
31
Subject Categories
Mathematical Methods, Biology & Medicine; Computer Sciences, Special Topics; Computer Science, Artificial Intelligence; Neurosciences; Physics, Applied
Journal title
Neural Networks
ISSN journal
0893-6080
Volume
9
Issue
3
Year of publication
1996
Pages
463 - 474
Database
ISI
SICI code
0893-6080(1996)9:3<463:NCAMOF>2.0.ZU;2-P
Abstract
This paper introduces a constrained second-order network with a multiple objective learning algorithm that forms closed hyperellipsoidal decision boundaries for one-class classification. The network architecture has uncoupled constraints that give independent control over each decision boundary's size, shape, position, and orientation. The architecture together with the learning algorithm guarantee the formation of positive definite eigenvalues for closed hyperellipsoidal decision boundaries. The learning algorithm incorporates two criteria, one that seeks to minimize classification mapping error and another that seeks to minimize the size of the decision boundaries. We consider both additive combinations and multiplicative combinations of the individual criteria, and we present empirical evidence for selecting functional forms of the individual objectives that are bounded and normalized. The resulting multiple objective criterion allows the decision boundaries to increase or decrease in size as necessary to achieve both within-class generalization and out-of-class generalization without requiring the use of non-target patterns in the training set. The resulting network learns compact closed decision boundaries when trained with target data only. We show results of applying the network to the Iris data set (Fisher (1936), Annals of Eugenics, 7(2), 179-188). Advantages of this approach include its inherent ability for one-class generalization, freedom from characterizing the non-target class, and the ability to form closed decision boundaries for multi-modal classes that are more complex than hyperspheres without requiring inversion of large matrices. Published by Elsevier Science Ltd.
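
The idea of combining a bounded mapping-error criterion with a bounded boundary-size criterion lends itself to a short numerical illustration. The Python sketch below is a minimal example, not the authors' published network or learning algorithm: the A^T A parameterization of the quadratic form (which keeps the boundary's eigenvalues non-negative, echoing the positive-definiteness requirement above), the sigmoid surrogate for mapping error, and the squashed-volume size term are all assumptions chosen only to show how a multiplicative combination of two bounded, normalized criteria can be evaluated on target-only training data.

    import numpy as np

    # Hyperellipsoidal decision boundary: d(x) = ||A (x - c)||^2 <= 1.
    # Writing the quadratic form as A^T A keeps its eigenvalues non-negative,
    # so the boundary stays closed (an assumed stand-in for the paper's
    # constrained second-order architecture).

    def mapping_error(X, c, A):
        """Bounded, normalized surrogate for classification mapping error:
        a sigmoid of the signed distance of each target point from the boundary,
        averaged over the training set (values in (0, 1))."""
        Z = (X - c) @ A.T
        d = np.einsum('ij,ij->i', Z, Z)            # squared Mahalanobis-style distances
        return np.mean(1.0 / (1.0 + np.exp(-(d - 1.0))))

    def boundary_size(A):
        """Bounded, normalized size term: the ellipsoid volume is proportional
        to 1/|det(A)|; squash it into (0, 1) so both criteria share a scale."""
        vol = 1.0 / max(abs(np.linalg.det(A)), 1e-12)
        return vol / (1.0 + vol)

    def multiplicative_objective(X, c, A):
        """One assumed multiplicative combination of the two bounded criteria,
        to be maximized: reward fitting the target data and a compact boundary."""
        return (1.0 - mapping_error(X, c, A)) * (1.0 - boundary_size(A))

    # Usage on synthetic stand-in target data (one class, no non-target patterns):
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))
    c, A = X.mean(axis=0), np.eye(2)
    print(multiplicative_objective(X, c, A))

Because both terms are bounded and normalized, neither can dominate the product, which mirrors the abstract's rationale for preferring bounded, normalized functional forms when letting the boundary grow or shrink to balance within-class and out-of-class generalization.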