We propose a new criterion for model selection in prediction problems. The covariance inflation criterion adjusts the training error by the average covariance of the predictions and responses when the prediction rule is applied to permuted versions of the data set. This criterion can be applied to general prediction problems (e.g. regression or classification) and to general prediction rules (e.g. stepwise regression, tree-based models and neural nets). As a by-product we obtain a measure of the effective number of parameters used by an adaptive procedure. We relate the covariance inflation criterion to other model selection procedures and illustrate its use in some regression and classification problems. We also revisit the conditional bootstrap approach to model selection.
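The adjustment described above can be sketched in code: fit the rule to permuted versions of the responses and average, over observations, the permutation covariance between fitted values and responses. This is a minimal illustrative sketch only; the function names, the use of squared error, and the factor of two are assumptions for concreteness, not the paper's exact formulation.

```python
import numpy as np

def covariance_inflation_criterion(fit_predict, X, y, n_perm=200, seed=None):
    """Sketch of a covariance-style adjustment to training error.

    fit_predict(X, y) must return in-sample predictions after fitting
    the rule to (X, y). Returns training error plus twice the average
    permutation covariance between predictions and responses.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    train_err = np.mean((y - fit_predict(X, y)) ** 2)

    # Refit the rule on permuted responses, storing predictions and
    # the permuted responses themselves.
    preds = np.empty((n_perm, n))
    perms = np.empty((n_perm, n))
    for b in range(n_perm):
        y_star = rng.permutation(y)
        perms[b] = y_star
        preds[b] = fit_predict(X, y_star)

    # Average over observations of the covariance, under permutation,
    # between the i-th fitted value and the i-th response.
    avg_cov = np.mean(
        [np.cov(preds[:, i], perms[:, i])[0, 1] for i in range(n)]
    )
    return train_err + 2.0 * avg_cov

def ols_fit_predict(X, y):
    """Hypothetical prediction rule for illustration: least squares."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ beta
```

For an adaptive rule such as stepwise regression, `fit_predict` would rerun the full selection procedure on each permuted data set, so the permutation covariance reflects the optimism of the entire fitting process.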