Robust inference with knockoffs

Citation
Rina Foygel Barber et al., Robust inference with knockoffs, Annals of Statistics, 48(3), 2020, pp. 1409-1431
Journal title
Annals of Statistics
ISSN journal
00905364
Volume
48
Issue
3
Year of publication
2020
Pages
1409 - 1431
Database
ACNP
Abstract
We consider the variable selection problem, which seeks to identify important variables influencing a response Y out of many candidate features X1, …, Xp. We wish to do so while offering finite-sample guarantees about the fraction of false positives: selected variables Xj that in fact have no effect on Y after the other features are known. When the number of features p is large (perhaps even larger than the sample size n), and we have no prior knowledge regarding the type of dependence between Y and X, the model-X knockoffs framework nonetheless allows us to select a model with a guaranteed bound on the false discovery rate, as long as the distribution of the feature vector X = (X1, …, Xp) is exactly known. This model selection procedure operates by constructing "knockoff copies" of each of the p features, which are then used as a control group to ensure that the model selection algorithm is not choosing too many irrelevant features. In this work, we study the practical setting where the distribution of X can only be estimated, rather than known exactly, and the knockoff copies of the Xj's are therefore constructed somewhat incorrectly. Our results, which are free of any modeling assumption whatsoever, show that the resulting model selection procedure incurs an inflation of the false discovery rate that is proportional to our errors in estimating the distribution of each feature Xj conditional on the remaining features {Xk : k ≠ j}. The model-X knockoffs framework is therefore robust to errors in the underlying assumptions on the distribution of X, making it an effective method for many practical applications, such as genome-wide association studies, where the underlying distribution on the features X1, …, Xp is estimated accurately but not known exactly.
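The selection procedure the abstract describes can be sketched as a toy example. This is an illustrative implementation, not the authors' code: it assumes features with identity covariance (under which an independent Gaussian copy is an exact model-X knockoff, i.e. s_j = 1 for all j), uses a simple marginal-correlation difference as the antisymmetric feature statistic W_j, and applies the knockoff+ stopping rule to control the false discovery rate at a target level q. The constants n, p, k, and q below are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 500, 50, 0.2  # sample size, number of features, target FDR level
k = 10                  # number of truly relevant features (first k)

# Features X ~ N(0, I_p). With identity covariance, an independent
# standard Gaussian matrix is an exact model-X knockoff copy.
X = rng.standard_normal((n, p))
X_knock = rng.standard_normal((n, p))

# Response depends only on the first k features.
beta = np.zeros(p)
beta[:k] = 1.0
y = X @ beta + rng.standard_normal(n)

# Antisymmetric feature statistics: large positive W_j suggests the
# original feature is more associated with y than its knockoff.
W = np.abs(X.T @ y) - np.abs(X_knock.T @ y)

# Knockoff+ threshold: the smallest t > 0 such that
# (1 + #{j : W_j <= -t}) / max(1, #{j : W_j >= t}) <= q.
thresh = np.inf
for t in np.sort(np.abs(W[W != 0])):
    ratio = (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t))
    if ratio <= q:
        thresh = t
        break

selected = np.where(W >= thresh)[0]
print("selected features:", selected)
```

Because the nulls' W values are symmetric around zero, the count of large negative statistics estimates the number of false positives among the large positive ones; this is the "control group" role of the knockoffs, and the resulting selection satisfies the finite-sample FDR bound when the knockoffs are constructed from the true feature distribution.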