Generalization performance of regularization networks and support vector machines via entropy numbers of compact operators

Citation
R.C. Williamson et al., Generalization performance of regularization networks and support vector machines via entropy numbers of compact operators, IEEE Trans. Inf. Theory, 47(6), 2001, pp. 2516-2532
Citations number
63
Subject Categories
Information Technology & Communication Systems
Journal title
IEEE TRANSACTIONS ON INFORMATION THEORY
ISSN journal
0018-9448
Volume
47
Issue
6
Year of publication
2001
Pages
2516 - 2532
Database
ISI
SICI code
0018-9448(200109)47:6<2516:GPORNA>2.0.ZU;2-8
Abstract
We derive new bounds for the generalization error of kernel machines, such as support vector machines and related regularization networks, by obtaining new bounds on their covering numbers. The proofs make use of a viewpoint that is apparently novel in the field of statistical learning theory. The hypothesis class is described in terms of a linear operator mapping from a possibly infinite-dimensional unit ball in feature space into a finite-dimensional space. The covering numbers of the class are then determined via the entropy numbers of the operator. These numbers, which characterize the degree of compactness of the operator, can be bounded in terms of the eigenvalues of an integral operator induced by the kernel function used by the machine. As a consequence, we are able to theoretically explain the effect of the choice of kernel function on the generalization performance of support vector machines.
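For orientation, the following is a minimal LaTeX sketch of the standard definitions linking covering numbers to entropy numbers, together with a Gordon-Koenig-Schuett-type bound for diagonal operators of the kind induced by the kernel eigenvalues; the absolute constant C and the exact form of the bound are assumptions here and may differ from the statements proved in the paper.

% Covering number of a set S in a metric space:
% \mathcal{N}(\varepsilon, S) = smallest number of closed
% \varepsilon-balls needed to cover S.
%
% Entropy numbers of an operator T : X -> Y, with B_X the unit ball of X:
\[
  e_n(T) \;=\; \inf\bigl\{\varepsilon > 0 \,:\,
      \mathcal{N}\bigl(\varepsilon,\, T(B_X)\bigr) \le 2^{\,n-1}\bigr\},
\]
% so that, for every n,
\[
  \varepsilon \ge e_n(T)
  \quad\Longrightarrow\quad
  \log_2 \mathcal{N}\bigl(\varepsilon,\, T(B_X)\bigr) \le n-1 .
\]
% For a diagonal operator D_\sigma : \ell_2 \to \ell_2 with
% non-increasing entries \sigma_1 \ge \sigma_2 \ge \dots \ge 0
% (e.g. the eigenvalues of the kernel's integral operator), a bound
% of Gordon--Koenig--Schuett type gives, for an absolute constant C,
\[
  e_n(D_\sigma) \;\le\; C \,\sup_{j \ge 1}\,
     2^{-\frac{n-1}{2j}}\,\bigl(\sigma_1 \sigma_2 \cdots \sigma_j\bigr)^{1/j}.
\]

Bounds of this shape make the dependence on the kernel explicit: faster eigenvalue decay of the induced integral operator yields smaller entropy numbers, hence smaller covering numbers and tighter generalization bounds.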