In this paper we present the neural network model known as the mixture-of-experts (MOE) and assess its classification accuracy and robustness. We do this by comparing the classification accuracy of MOE, the backpropagation neural network (BPN), Fisher's discriminant analysis, logistic regression, k-nearest neighbor, and kernel density estimation on five real-world two-group data sets. Our results lead to three major conclusions: (1) the MOE network architecture is more accurate than BPN; (2) MOE tends to be more accurate than the parametric and non-parametric methods investigated; (3) MOE is a far more robust classifier than the other methods for the two-group problem. © 1999 Elsevier Science Ltd. All rights reserved.