The generalization error of the symmetric and scaled support vector machines

Citation
Jf. Feng and P. Williams, The generalization error of the symmetric and scaled support vector machines, IEEE Transactions on Neural Networks, 12(5), 2001, pp. 1255-1260
Number of citations
8
Subject categories
AI Robotics and Automatic Control
Journal title
IEEE TRANSACTIONS ON NEURAL NETWORKS
ISSN journal
1045-9227
Volume
12
Issue
5
Year of publication
2001
Pages
1255 - 1260
Database
ISI
SICI code
1045-9227(200109)12:5<1255:TGEOTS>2.0.ZU;2-#
Abstract
It is generally believed that the support vector machine (SVM) optimizes the generalization error and outperforms other learning machines. We show analytically, by concrete examples in the one-dimensional case, that the SVM does improve the mean and standard deviation of the generalization error by a constant factor, compared to the worst learning machine. Our approach is in terms of extreme value theory, and both the mean and variance of the generalization error are calculated exactly for all cases considered. We propose a new version of the SVM (scaled SVM) which can further reduce the mean of the generalization error of the SVM.
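The comparison described in the abstract can be illustrated numerically. The sketch below is a hypothetical Monte Carlo setup, not the paper's exact extreme-value-theory calculation: it assumes separable one-dimensional data with the true boundary at 0, class -1 uniform on [-1, 0] and class +1 uniform on [0, 1], and a uniform test distribution on [-1, 1]. Under those assumptions the hard-margin SVM threshold is the midpoint of the two innermost training points, a "worst" consistent learning machine places its threshold at the far edge of the version space, and the generalization error of a threshold t is |t|/2. The sample sizes and trial count are arbitrary choices.

```python
# Hypothetical Monte Carlo illustration (assumed setup, not the paper's derivation):
# compare the mean and std of the generalization error of the symmetric (midpoint)
# SVM threshold against the worst consistent threshold in one dimension.
import numpy as np

rng = np.random.default_rng(0)
n_per_class, trials = 20, 100_000

svm_err, worst_err = [], []
for _ in range(trials):
    neg = rng.uniform(-1.0, 0.0, n_per_class)   # class -1 training samples
    pos = rng.uniform(0.0, 1.0, n_per_class)    # class +1 training samples
    lo, hi = neg.max(), pos.min()               # version-space interval [lo, hi]
    svm_t = 0.5 * (lo + hi)                     # symmetric SVM: midpoint threshold
    worst_t = lo if abs(lo) > abs(hi) else hi   # worst consistent threshold
    svm_err.append(abs(svm_t) / 2.0)            # error = |threshold| / 2 here
    worst_err.append(abs(worst_t) / 2.0)

print("SVM   generalization error: mean=%.5f std=%.5f"
      % (np.mean(svm_err), np.std(svm_err)))
print("Worst generalization error: mean=%.5f std=%.5f"
      % (np.mean(worst_err), np.std(worst_err)))
```

Running this shows the midpoint rule improving both the mean and the standard deviation of the error by roughly a constant factor relative to the worst consistent machine, in line with the abstract's claim; the paper itself obtains these quantities exactly rather than by simulation.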