Ensemble classifiers and algorithms for learning ensembles have recently received a great deal of attention in the machine learning literature (R.E. Schapire, Machine Learning 5 (2) (1990) 197-227; N. Cesa-Bianchi, Y. Freund, D. Haussler, D.P. Helmbold, R.E. Schapire, M.K. Warmuth, Proceedings of the 25th Annual ACM Symposium on the Theory of Computing, 1993, pp. 382-391; L. Breiman, Bias, variance, and arcing classifiers, Technical Report 460, Statistics Department, University of California, Berkeley, CA, 1996; J.R. Quinlan, Proceedings of the 14th International Conference on Machine Learning, Italy, 1997; Y. Freund, R.E. Schapire, Proceedings of the 13th International Conference on Machine Learning (ICML '96), Bari, Italy, 1996, pp. 148-157; A.J.C. Sharkey, N.E. Sharkey, Combining diverse neural nets, The Knowledge Engineering Review 12 (3) (1997) 231-247). In particular, boosting has attracted attention as a mechanism for discovering an ensemble of classifiers that generalises better than any single classifier derived using the same technique. In this article, we examine and compare a number of techniques for pruning a classifier ensemble that is overfit on its training set, and we find that a real-valued genetic algorithm (GA) is at least as good as the best heuristic search algorithm for choosing an ensemble weighting. (C) 1999 Elsevier Science B.V. All rights reserved.
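
To make the pruning idea concrete, here is a minimal sketch of a real-valued GA of the kind the abstract refers to: it evolves a weight vector over the members of a trained ensemble, scores each candidate by the accuracy of the resulting weighted vote on a held-out set, and prunes members whose evolved weight is negligible. This is an illustrative sketch under assumed choices, not the paper's implementation; the simulated member predictions, population size, mutation rate, and pruning threshold are all assumptions.

```python
# Illustrative sketch only: a real-valued GA for ensemble weighting/pruning.
# All parameters (population size, mutation rate, 0.1 pruning threshold) are
# assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 0/1 predictions of 10 hypothetical base classifiers on a
# two-class validation set (rows: classifiers, columns: examples).
n_members, n_val = 10, 200
y_val = rng.integers(0, 2, size=n_val)
preds = np.where(rng.random((n_members, n_val)) < 0.7, y_val, 1 - y_val)

def fitness(w):
    """Held-out accuracy of the weighted majority vote under weights w."""
    score1 = w @ preds                       # weight mass voting for class 1
    vote = (score1 > w.sum() - score1).astype(int)
    return np.mean(vote == y_val)

pop_size, n_gen = 40, 60
pop = rng.random((pop_size, n_members))      # weight vectors in [0, 1]^n

for gen in range(n_gen):
    fit = np.array([fitness(w) for w in pop])
    children = [pop[np.argmax(fit)].copy()]  # elitism: keep the best as-is
    while len(children) < pop_size:
        # Binary tournament selection for each parent.
        i, j = rng.integers(pop_size, size=2)
        p1 = pop[i] if fit[i] >= fit[j] else pop[j]
        i, j = rng.integers(pop_size, size=2)
        p2 = pop[i] if fit[i] >= fit[j] else pop[j]
        # Blend (arithmetic) crossover, then sparse Gaussian mutation.
        alpha = rng.random(n_members)
        child = alpha * p1 + (1 - alpha) * p2
        child += rng.normal(0, 0.05, n_members) * (rng.random(n_members) < 0.2)
        children.append(np.clip(child, 0.0, 1.0))
    pop = np.array(children)

best = pop[np.argmax([fitness(w) for w in pop])]
kept = best > 0.1    # prune members with negligible evolved weight
print(f"validation accuracy: {fitness(best):.3f}; "
      f"members kept: {kept.sum()}/{n_members}")
```

Raising the pruning threshold trades ensemble size against held-out accuracy; with a threshold near zero the GA acts purely as a weighting scheme. Tournament selection and blend crossover are standard real-valued GA operators, chosen here for brevity.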