P. F. Thall et al., "Variable Selection in Regression via Repeated Data Splitting," Journal of Computational and Graphical Statistics, 6(4), 1997, pp. 416-434
A new algorithm, backward elimination via repeated data splitting (BERDS), is proposed for variable selection in regression. Initially, the data are partitioned into two sets {E, V}, and an exhaustive backward elimination (BE) is performed in E. For each p-value cutoff alpha used in BE, the corresponding fitted model from E is validated in V by computing the sum of squared deviations of observed from predicted values. This is repeated m times, and the alpha minimizing the sum of the m sums of squares is used as the cutoff in a final BE on the entire data set.
BERDS is a modification of the algorithm BECV proposed by Thall, Simon, and Grier (1992). An extensive simulation study shows that, compared to BECV, BERDS has smaller model error and higher probabilities of excluding noise variables, of selecting each of several uncorrelated true predictors, and of selecting exactly one of two or three highly correlated true predictors. BERDS is also superior to standard BE with cutoffs .05 or .10, and this superiority increases with the number of noise variables in the data and the degree of correlation among the true predictors. An application is provided for illustration.
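The procedure described in the abstract can be sketched in code. This is an illustrative reconstruction, not the authors' implementation: the helper names (`backward_eliminate`, `berds`), the 50/50 E/V split fraction, and the candidate alpha grid are assumptions made for the example.

```python
import numpy as np
from scipy import stats

def backward_eliminate(X, y, alpha):
    """Backward elimination: repeatedly drop the predictor with the
    largest p-value until every remaining p-value is <= alpha."""
    keep = list(range(X.shape[1]))
    while keep:
        # OLS fit with an intercept on the currently kept columns.
        A = np.column_stack([np.ones(len(y)), X[:, keep]])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        df = len(y) - A.shape[1]
        s2 = resid @ resid / df                      # residual variance
        cov = s2 * np.linalg.inv(A.T @ A)            # coefficient covariance
        se = np.sqrt(np.diag(cov))[1:]               # skip the intercept
        t = beta[1:] / se
        p = 2 * stats.t.sf(np.abs(t), df)            # two-sided p-values
        worst = int(np.argmax(p))
        if p[worst] <= alpha:                        # all significant: stop
            break
        keep.pop(worst)
    return keep

def berds(X, y, alphas, m=25, frac=0.5, rng=None):
    """BERDS sketch: over m random {E, V} splits, score each candidate
    alpha by the validation sum of squared prediction errors summed
    across splits, then run a final BE on the full data with the
    alpha that minimizes that total."""
    rng = np.random.default_rng(rng)
    n = len(y)
    scores = np.zeros(len(alphas))
    for _ in range(m):
        idx = rng.permutation(n)
        cut = int(frac * n)
        e, v = idx[:cut], idx[cut:]
        for j, a in enumerate(alphas):
            keep = backward_eliminate(X[e], y[e], a)
            Ae = np.column_stack([np.ones(len(e)), X[np.ix_(e, keep)]])
            beta, *_ = np.linalg.lstsq(Ae, y[e], rcond=None)
            Av = np.column_stack([np.ones(len(v)), X[np.ix_(v, keep)]])
            scores[j] += np.sum((y[v] - Av @ beta) ** 2)  # validate in V
    best = alphas[int(np.argmin(scores))]
    return backward_eliminate(X, y, best), best
```

With strong true predictors and independent noise columns, the final model returned by `berds` retains the true predictors while the data-driven cutoff controls how aggressively noise columns are screened out.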