Enlarging the margins in perceptron decision trees

Citation
K.P. Bennett et al., Enlarging the margins in perceptron decision trees, MACH LEARN, 41(3), 2000, pp. 295-313
Number of citations
30
Subject Categories
AI Robotics and Automatic Control
Journal title
MACHINE LEARNING
ISSN journal
08856125 → ACNP
Volume
41
Issue
3
Year of publication
2000
Pages
295 - 313
Database
ISI
SICI code
0885-6125(200012)41:3<295:ETMIPD>2.0.ZU;2-L
Abstract
Capacity control in perceptron decision trees is typically performed by controlling their size. We prove that other quantities can be as relevant for reducing their flexibility and combating overfitting. In particular, we provide an upper bound on the generalization error which depends both on the size of the tree and on the margin of the decision nodes, so enlarging the margin in perceptron decision trees reduces the upper bound on the generalization error. Based on this analysis, we introduce three new algorithms which can induce large margin perceptron decision trees. To assess the effect of the large margin bias, OC1 (Journal of Artificial Intelligence Research, 1994, 2, 1-32) of Murthy, Kasif and Salzberg, a well-known system for inducing perceptron decision trees, is used as the baseline algorithm. An extensive experimental study on real-world data showed that all three new algorithms perform better, or at least not significantly worse, than OC1 on every dataset but one, and OC1 performed worse than the best margin-based method on every dataset.
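The margin quantity the abstract refers to can be made concrete with a small sketch. In a perceptron decision tree, each internal node routes points with a linear test w·x + b > 0, and the node's geometric margin is the smallest distance from any training point to that hyperplane. The sketch below (function names are illustrative, not from the paper or OC1) compares two hyperplanes that split the same data correctly but leave different margins:

```python
import numpy as np

def route(w, b, x):
    """Perceptron decision node: send x right if w.x + b > 0, else left."""
    return np.dot(w, x) + b > 0

def geometric_margin(w, b, X):
    """Smallest distance from any point in X to the hyperplane w.x + b = 0."""
    return np.min(np.abs(X @ w + b)) / np.linalg.norm(w)

# Two separable clusters in 2-D.
X = np.array([[0.0, 0.0], [0.0, 1.0], [3.0, 3.0], [3.0, 4.0]])

# Both hyperplanes split the clusters identically ...
w1, b1 = np.array([1.0, 0.0]), -0.5   # hugs the left cluster
w2, b2 = np.array([1.0, 0.0]), -1.5   # centred between the clusters

assert [route(w1, b1, x) for x in X] == [route(w2, b2, x) for x in X]

# ... but the second leaves a larger margin, which, by the paper's bound,
# lowers the upper bound on generalization error.
print(geometric_margin(w1, b1, X))  # 0.5
print(geometric_margin(w2, b2, X))  # 1.5
```

The paper's three algorithms differ in how they search for such large-margin splits; the sketch only illustrates the quantity being enlarged, not their optimization procedures.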