COMPARING SUPPORT VECTOR MACHINES WITH GAUSSIAN KERNELS TO RADIAL BASIS FUNCTION CLASSIFIERS

Citation
B. Schölkopf et al., COMPARING SUPPORT VECTOR MACHINES WITH GAUSSIAN KERNELS TO RADIAL BASIS FUNCTION CLASSIFIERS, IEEE Transactions on Signal Processing, 45(11), 1997, pp. 2758-2765
Citations number
27
Subject Categories
Engineering, Electrical & Electronic
ISSN journal
1053-587X
Volume
45
Issue
11
Year of publication
1997
Pages
2758 - 2765
Database
ISI
SICI code
1053-587X(1997)45:11<2758:CSVMWG>2.0.ZU;2-R
Abstract
The support vector (SV) machine is a novel type of learning machine, based on statistical learning theory, which contains polynomial classifiers, neural networks, and radial basis function (RBF) networks as special cases. In the RBF case, the SV algorithm automatically determines centers, weights, and threshold that minimize an upper bound on the expected test error. The present study is devoted to an experimental comparison of these machines with a classical approach, where the centers are determined by k-means clustering and the weights are computed using error backpropagation. We consider three machines, namely, a classical RBF machine, an SV machine with Gaussian kernel, and a hybrid system with the centers determined by the SV method and the weights trained by error backpropagation. Our results show that on the United States Postal Service database of handwritten digits, the SV machine achieves the highest recognition accuracy, followed by the hybrid system. The SV approach is thus not only theoretically well-founded but also superior in a practical application.
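The following is a minimal sketch, not the authors' original experimental setup, illustrating the three classifiers described in the abstract. It uses scikit-learn on a small digits dataset rather than the USPS database; the kernel width (gamma), the number of k-means centers, and the use of logistic regression as a stand-in for backpropagation-trained output weights are illustrative assumptions.

# Sketch of the three machines compared in the paper (assumed parameters).
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.metrics.pairwise import rbf_kernel

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

gamma = 1.0 / X.shape[1]  # Gaussian kernel width; an assumed default

# (1) SV machine with Gaussian (RBF) kernel: the support vectors act as
#     centers, and weights/threshold come out of the SV optimization.
svm = SVC(kernel="rbf", gamma=gamma).fit(X_train, y_train)
print("SV machine accuracy:", svm.score(X_test, y_test))

# (2) Classical RBF machine: centers from k-means clustering, then a linear
#     output layer trained on the RBF activations (logistic regression here
#     as a stand-in for error backpropagation on the output weights).
centers = KMeans(n_clusters=100, random_state=0).fit(X_train).cluster_centers_
Phi_train = rbf_kernel(X_train, centers, gamma=gamma)
Phi_test = rbf_kernel(X_test, centers, gamma=gamma)
rbf_net = LogisticRegression(max_iter=1000).fit(Phi_train, y_train)
print("Classical RBF accuracy:", rbf_net.score(Phi_test, y_test))

# (3) Hybrid system: centers taken from the SV machine's support vectors,
#     output weights retrained as in (2).
sv_centers = svm.support_vectors_
Psi_train = rbf_kernel(X_train, sv_centers, gamma=gamma)
Psi_test = rbf_kernel(X_test, sv_centers, gamma=gamma)
hybrid = LogisticRegression(max_iter=1000).fit(Psi_train, y_train)
print("Hybrid accuracy:", hybrid.score(Psi_test, y_test))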