An improved neural classification network for the two-group problem

Citation
P. Mangiameli and D. West, An improved neural classification network for the two-group problem, COMPUT OPER, 26(5), 1999, pp. 443-460
Citations number
16
Subject Categories
Engineering Management /General
Journal title
COMPUTERS & OPERATIONS RESEARCH
ISSN journal
0305-0548
Volume
26
Issue
5
Year of publication
1999
Pages
443 - 460
Database
ISI
SICI code
0305-0548(199904)26:5<443:AINCNF>2.0.ZU;2-9
Abstract
In this paper we present the neural network model known as the mixture-of-experts (MOE) and determine its accuracy and robustness. We do this by comparing the classification accuracy of MOE, the backpropagation neural network (BPN), Fisher's discriminant analysis, logistic regression, k-nearest neighbor, and the kernel density method on five real-world two-group data sets. Our results lead to three major conclusions: (1) the MOE network architecture is more accurate than BPN; (2) MOE tends to be more accurate than the parametric and non-parametric methods investigated; (3) MOE is a far more robust classifier than the other methods for the two-group problem. (C) 1999 Elsevier Science Ltd. All rights reserved.
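The abstract's central idea can be sketched in code. A mixture-of-experts combines several expert classifiers through a gating function that assigns input-dependent mixing weights; the minimal sketch below uses two hypothetical logistic experts and a softmax gate (all parameter values are illustrative, not taken from the paper).

```python
import math

def softmax(zs):
    """Turn gate scores into mixing weights that sum to 1."""
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def moe_predict(x, experts, gate):
    """Blend expert outputs with input-dependent gate weights.

    Each expert is a (weights, bias) pair acting as a logistic
    classifier; the gate scores the same input and the softmax of
    those scores weights the experts' probabilities.
    """
    weights = softmax([dot(g, x) for g in gate])
    probs = [sigmoid(dot(w, x) + b) for w, b in experts]
    return sum(wt * p for wt, p in zip(weights, probs))

# Hypothetical parameters: two experts specialising in
# different regions of the input space.
experts = [([2.0, -1.0], 0.0), ([-1.0, 3.0], 0.5)]
gate = [[1.0, 0.0], [0.0, 1.0]]

p = moe_predict([0.5, 0.5], experts, gate)
label = int(p >= 0.5)  # two-group decision rule
```

In the paper the expert and gate parameters would be learned from the training data; here they are fixed only to show how the gate arbitrates between experts for the two-group decision.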