BOOSTING AND OTHER ENSEMBLE METHODS

Citation
H. Drucker et al., BOOSTING AND OTHER ENSEMBLE METHODS, Neural Computation, 6(6), 1994, pp. 1289-1301
Number of citations
22
Subject Categories
Computer Sciences; Computer Science, Artificial Intelligence; Neurosciences
Journal title
Neural Computation
ISSN journal
0899-7667
Volume
6
Issue
6
Year of publication
1994
Pages
1289 - 1301
Database
ISI
SICI code
0899-7667(1994)6:6<1289:BAOEM>2.0.ZU;2-A
Abstract
We compare the performance of three types of neural network-based ensemble techniques to that of a single neural network. The ensemble algorithms are two versions of boosting and committees of neural networks trained independently. For each of the four algorithms, we experimentally determine the test and training error curves in an optical character recognition (OCR) problem as both a function of training set size and computational cost, using three architectures. We show that a single machine is best for small training set sizes, while for large training set sizes some version of boosting is best. However, for a given computational cost, boosting is always best. Furthermore, we show a surprising result for the original boosting algorithm: namely, that as the training set size increases, the training error decreases until it asymptotes to the test error rate. This has potential implications in the search for better training algorithms.
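
The two ensemble styles named in the abstract can be illustrated with a minimal sketch. The code below is not the paper's procedure: the NearestCentroid learner is an illustrative stand-in for the neural networks, and boost_by_filtering, committee_predict, and the subset-padding rule are hypothetical simplifications of boosting by filtering, which in the original algorithm routes examples to later learners based on earlier learners' errors and disagreements.

# Minimal sketch (assumptions noted above): committee voting over
# independently trained learners vs. a rough boosting-by-filtering loop.
import numpy as np

class NearestCentroid:
    # Stand-in weak learner: predicts the class with the closest mean.
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        d = ((X[:, None, :] - self.centroids_[None, :, :]) ** 2).sum(-1)
        return self.classes_[d.argmin(axis=1)]

def committee_predict(members, X):
    # Majority vote over members trained independently (ties -> lowest label).
    votes = np.stack([m.predict(X) for m in members])
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

def boost_by_filtering(X, y, rounds=3, subset=200, rng=None):
    # Each round after the first trains on examples the current ensemble
    # misclassifies, padded with randomly chosen correct examples; a crude
    # approximation of the filtering schedule in the original algorithm.
    rng = np.random.default_rng(rng)
    members = []
    for _ in range(rounds):
        if members:
            wrong = committee_predict(members, X) != y
            idx = np.flatnonzero(wrong)
            if idx.size < subset:
                pad = rng.choice(np.flatnonzero(~wrong), subset - idx.size, replace=False)
                idx = np.concatenate([idx, pad])
        else:
            idx = rng.choice(len(X), subset, replace=False)
        members.append(NearestCentroid().fit(X[idx], y[idx]))
    return members

# Toy usage on three synthetic Gaussian clusters:
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(c, 1.0, size=(400, 16)) for c in range(3)])
y = np.repeat(np.arange(3), 400)
members = boost_by_filtering(X, y, rounds=3, subset=200, rng=0)
print((committee_predict(members, X) == y).mean())

A committee reduces variance by averaging independent errors, while the boosting loop spends later training effort on the examples the ensemble currently gets wrong; that difference is what drives the cost-for-accuracy comparison described in the abstract.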