JOINT VARIABLE AND RANK SELECTION FOR PARSIMONIOUS ESTIMATION OF HIGH-DIMENSIONAL MATRICES

Citation
Florentina Bunea et al., Joint Variable and Rank Selection for Parsimonious Estimation of High-Dimensional Matrices, Annals of Statistics, 40(5), 2012, pp. 2359-2388
Journal title
Annals of Statistics
ISSN journal
0090-5364
Volume
40
Issue
5
Year of publication
2012
Pages
2359-2388
Database
ACNP
Abstract
We propose dimension reduction methods for sparse, high-dimensional multivariate response regression models. Both the number of responses and that of the predictors may exceed the sample size. Sometimes viewed as complementary, predictor selection and rank reduction are the most popular strategies for obtaining lower-dimensional approximations of the parameter matrix in such models. We show in this article that important gains in prediction accuracy can be obtained by considering them jointly. We motivate a new class of sparse multivariate regression models, in which the coefficient matrix has low rank and zero rows or can be well approximated by such a matrix. Next, we introduce estimators that are based on penalized least squares, with novel penalties that impose simultaneous row and rank restrictions on the coefficient matrix. We prove that these estimators indeed adapt to the unknown matrix sparsity and have fast rates of convergence. We support our theoretical results with an extensive simulation study and two data analyses.
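The model class motivated in the abstract, a coefficient matrix with zero rows and low rank, can be illustrated with a small simulation. The sketch below is not the paper's penalized estimator; it uses a simple two-step surrogate (thresholded least-squares row selection followed by rank truncation of the fit on the selected predictors) purely to show how row sparsity and rank reduction combine. All dimensions, thresholds, and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, m, r, s = 100, 30, 10, 2, 5  # samples, predictors, responses, rank, nonzero rows

# True coefficient matrix B: only the first s rows are nonzero, and those rows
# lie in an r-dimensional subspace, so rank(B) <= r.
B = np.zeros((p, m))
B[:s] = rng.normal(size=(s, r)) @ rng.normal(size=(r, m))

X = rng.normal(size=(n, p))
Y = X @ B + 0.1 * rng.normal(size=(n, m))

# Step 1 (row selection, a crude stand-in for a row-sparsity penalty):
# keep predictors whose least-squares coefficient rows are large.
B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
row_norms = np.linalg.norm(B_ols, axis=1)
keep = row_norms > 5 * np.median(row_norms)  # illustrative threshold

# Step 2 (rank reduction): refit on the selected predictors and truncate
# the SVD of the fitted coefficient block to rank r.
B_sub, *_ = np.linalg.lstsq(X[:, keep], Y, rcond=None)
U, sv, Vt = np.linalg.svd(B_sub, full_matrices=False)
B_hat = np.zeros((p, m))
B_hat[keep] = U[:, :r] @ np.diag(sv[:r]) @ Vt[:r]

rel_err = np.linalg.norm(B_hat - B) / np.linalg.norm(B)
```

The resulting estimate is simultaneously row-sparse (rows outside `keep` are exactly zero) and of rank at most `r`, which is the structure the paper's joint penalties target in one estimation step rather than two.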