We propose a new class of support vector algorithms for regression and classification. In these algorithms, a parameter nu lets one effectively control the number of support vectors. While this can be useful in its own right, the parameterization has the additional benefit of enabling us to eliminate one of the other free parameters of the algorithm: the accuracy parameter epsilon in the regression case, and the regularization constant C in the classification case. We describe the algorithms, give some theoretical results concerning the meaning and the choice of nu, and report experimental results.
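As a concrete illustration of the role of nu, the sketch below uses scikit-learn's NuSVR (a standard implementation of nu-support-vector regression) to fit the same noisy data at several values of nu. The synthetic data, kernel, and parameter values are illustrative choices, not taken from the paper; the point is only that raising nu admits more support vectors, consistent with nu acting as a lower bound on their fraction.

```python
import numpy as np
from sklearn.svm import NuSVR

# Illustrative synthetic regression data (not from the paper).
rng = np.random.RandomState(0)
X = np.sort(rng.uniform(-3, 3, size=(200, 1)), axis=0)
y = np.sin(X).ravel() + 0.1 * rng.randn(200)

# Fit nu-SVR at increasing nu and record the fraction of
# training points that end up as support vectors.
sv_fractions = {}
for nu in (0.1, 0.5, 0.9):
    model = NuSVR(nu=nu, C=1.0, kernel="rbf").fit(X, y)
    sv_fractions[nu] = len(model.support_) / len(X)
    print(f"nu = {nu}: fraction of support vectors = {sv_fractions[nu]:.2f}")
```

In line with the theoretical results the abstract refers to, the fraction of support vectors should not decrease as nu grows, and epsilon is no longer a free parameter: it is determined automatically during training.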