Pierre Alquier et al., Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions, Annals of Statistics, 47(4), 2019, pp. 2117–2144
We obtain estimation error rates and sharp oracle inequalities for regularization procedures of the form f̂ ∈ argmin_{f∈F} ( (1/N) ∑_{i=1}^N ℓ(f(X_i), Y_i) + λ‖f‖ ), where ‖·‖ is any norm, F is a convex class of functions, and ℓ is a Lipschitz loss function satisfying a Bernstein condition over F. We explore both the bounded and sub-Gaussian stochastic frameworks for the distribution of the f(X_i)'s, with no assumption on the distribution of the Y_i's. The general results rely on two main objects: a complexity function and a sparsity equation, which depend on the specific setting at hand (loss ℓ and norm ‖·‖). As a proof of concept, we obtain minimax rates of convergence for the following problems: (1) matrix completion with any Lipschitz loss function, including the hinge and logistic losses for the so-called 1-bit matrix completion instance of the problem, and quantile losses for the general case, which makes it possible to estimate any quantile of the entries of the matrix; (2) the logistic LASSO and variants such as the logistic SLOPE, as well as shape-constrained logistic regression; (3) kernel methods, where the loss is the hinge loss and the regularization function is the RKHS norm.
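To make the abstract's estimator concrete, the following is a minimal sketch (not the authors' code) of one instance covered by the paper, the logistic LASSO: regularized empirical risk minimization with the Lipschitz logistic loss and the ℓ1 norm, solved here by plain proximal gradient descent (ISTA). All function names, the synthetic data, and the step-size/iteration choices are illustrative assumptions, not from the paper.

```python
import numpy as np

def logistic_loss(w, X, y):
    # Average logistic loss (1/N) * sum_i log(1 + exp(-y_i <w, x_i>)),
    # with labels y_i in {-1, +1}; this loss is 1-Lipschitz in the margin.
    margins = y * (X @ w)
    return np.mean(np.logaddexp(0.0, -margins))

def soft_threshold(w, t):
    # Proximal operator of t * ||.||_1 (coordinate-wise soft-thresholding).
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def logistic_lasso(X, y, lam, step=0.1, n_iter=500):
    """Proximal gradient (ISTA) sketch for the logistic LASSO:
       argmin_w (1/N) sum_i log(1 + exp(-y_i <w, x_i>)) + lam * ||w||_1,
    i.e. the abstract's f-hat with f linear, ell the logistic loss,
    and ||.|| the l1 norm. Step size and iteration count are assumptions."""
    N, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        margins = y * (X @ w)
        # Gradient of the average logistic loss: -(1/N) X^T (y * sigmoid(-margins)).
        grad = -(X.T @ (y / (1.0 + np.exp(margins)))) / N
        # Gradient step on the smooth loss, then prox step on lam * ||.||_1.
        w = soft_threshold(w - step * grad, step * lam)
    return w

# Usage on synthetic sparse data (illustrative only).
rng = np.random.default_rng(0)
N, d = 200, 20
w_star = np.zeros(d)
w_star[:3] = [2.0, -2.0, 1.5]           # sparse ground truth
X = rng.standard_normal((N, d))
y = np.sign(X @ w_star + 0.1 * rng.standard_normal(N))
w_hat = logistic_lasso(X, y, lam=0.05)
```

The soft-thresholding step is what produces sparse estimates; the same template applies to the paper's other examples by swapping the loss (hinge, quantile) and the norm (SLOPE norm, nuclear norm for matrix completion, RKHS norm), though the optimization method would then change accordingly.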