We refine the first theorem of Schapire and Singer (R.E. Schapire, Y. Singer, in: Proceedings of the 11th Annual ACM Conference on Computational Learning Theory, 1998, pp. 80-91), which bounds the error of the AdaBoost boosting algorithm, so as to integrate the Bayes risk. This suggests that significant time savings can be obtained on some domains without degrading the solution. An example application is given in the field of feature selection. © 2001 Elsevier Science B.V. All rights reserved.
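For context, the result being refined is the training-error bound of Schapire and Singer's first theorem. In its standard form (stated here without the Bayes-risk refinement, which is developed in the paper itself), it bounds the training error of the final AdaBoost classifier H(x) = sign(\sum_t \alpha_t h_t(x)) by the product of the per-round normalization factors:

\[
\frac{1}{m}\,\bigl|\{\, i : H(x_i) \neq y_i \,\}\bigr|
\;\le\; \prod_{t=1}^{T} Z_t,
\qquad
Z_t \;=\; \sum_{i=1}^{m} D_t(i)\,\exp\!\bigl(-\alpha_t\, y_i\, h_t(x_i)\bigr),
\]

where D_t is the distribution maintained over the m training examples at round t, h_t is the weak hypothesis chosen at that round, and \alpha_t is its coefficient. Since each Z_t < 1 whenever h_t performs better than random guessing, the bound decreases geometrically with the number of rounds; the refinement sketched above tightens it further by taking the Bayes risk of the domain into account.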