In a linear regression model, the Dantzig selector (Candès and Tao, 2007) minimizes the L1 norm of the regression coefficients subject to a bound λ on the L∞ norm of the covariances between the predictors and the residuals; the resulting estimator is the solution of a linear program, which may be nonunique or unstable. We propose a regularized alternative to the Dantzig selector. These estimators (which depend on λ and an additional tuning parameter r) minimize objective functions that are the sum of the L1 norm of the regression coefficients plus r times the logarithmic potential function of the Dantzig selector constraints, and can be viewed as penalized analytic centers of the latter constraints. The tuning parameter r controls the smoothness of the estimators as functions of λ and, when λ is sufficiently large, the estimators depend approximately on r and λ via r/λ².
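As a rough illustrative sketch (not the authors' implementation), the penalized analytic center idea can be written down directly: minimize ||β||₁ + r·Φ(β), where Φ is the log-barrier of the Dantzig constraints |xⱼᵀ(y − Xβ)| ≤ λ. The data, the symbol names, and the use of a generic derivative-free optimizer below are all assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated data (illustrative only; not from the paper).
rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, 0.0, -1.0])
y = X @ beta_true + 0.5 * rng.standard_normal(n)

lam, r = 5.0, 1.0  # constraint bound lambda and smoothing parameter r

def objective(beta):
    """L1 norm plus r times the log-potential of the Dantzig constraints."""
    c = X.T @ (y - X @ beta)        # covariances between predictors and residuals
    if np.any(np.abs(c) >= lam):    # outside the Dantzig polytope
        return np.inf
    # Log-potential (barrier) of the 2p constraints -lam < c_j < lam.
    phi = -np.sum(np.log(lam - c)) - np.sum(np.log(lam + c))
    return np.sum(np.abs(beta)) + r * phi

# The OLS solution gives c = 0, an interior point of the constraint set,
# so it is a valid starting value for the barrier objective.
beta0 = np.linalg.lstsq(X, y, rcond=None)[0]
res = minimize(objective, beta0, method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-8})
beta_hat = res.x
```

Setting r large pushes `beta_hat` toward the analytic center of the constraint polytope, while r → 0 recovers a Dantzig-selector-type solution; a dedicated interior-point solver would be used in practice rather than a generic simplex search.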