S. Portnoy and R. Koenker, THE GAUSSIAN HARE AND THE LAPLACIAN TORTOISE: COMPUTABILITY OF SQUARED-ERROR VERSUS ABSOLUTE-ERROR ESTIMATORS, Statistical Science, 12(4), 1997, pp. 279-296
Since the time of Gauss, it has been generally accepted that l(2)-methods of combining observations by minimizing sums of squared errors have significant computational advantages over earlier l(1)-methods based on minimizing sums of absolute errors, as advocated by Boscovich, Laplace and others. However, l(1)-methods are known to have significant robustness advantages over l(2)-methods in many applications, and related quantile regression methods provide a useful, complementary approach to classical least-squares estimation of statistical models. By combining recent advances in interior point methods for solving linear programs with a new statistical preprocessing approach for l(1)-type problems, we obtain a 10- to 100-fold improvement in computational speed over current (simplex-based) l(1)-algorithms on large problems, demonstrating that l(1)-methods can be made competitive with l(2)-methods in computational speed throughout the entire range of problem sizes. Formal complexity results suggest that l(1)-regression can be made faster than least-squares regression for n sufficiently large and p modest.
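The abstract rests on the fact that l(1)-regression can be posed as a linear program; the following is a minimal sketch of that standard formulation, not the authors' interior-point-plus-preprocessing algorithm. It uses SciPy's `linprog` with the HiGHS solver, and the toy data (a line y = 1 + 2x with one gross outlier) are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def lad_fit(X, y):
    """Least-absolute-deviations fit via the LP:
       min 1'u + 1'v  subject to  X b + u - v = y,  u >= 0, v >= 0,
    where u and v are the positive and negative parts of the residuals."""
    n, p = X.shape
    # Objective: zero cost on the coefficients b, unit cost on u and v.
    c = np.concatenate([np.zeros(p), np.ones(2 * n)])
    # Equality constraints: X b + I u - I v = y.
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    # b is free; u and v are nonnegative.
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

# Toy data: five points, four on the line y = 1 + 2x plus one outlier.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
X = np.column_stack([np.ones_like(x), x])
y = np.array([1.0, 3.0, 5.0, 7.0, 100.0])
beta = lad_fit(X, y)
```

The fit recovers the intercept 1 and slope 2 despite the outlier, illustrating the robustness advantage of l(1)-estimation that the abstract contrasts with least squares.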