Akaike's information criterion (AIC), derived from the asymptotics of the maximum likelihood estimator, is widely used in model selection. However, it has a finite-sample bias that produces overfitting in linear regression. To deal with this problem, Ishiguro, Sakamoto, and Kitagawa proposed a bootstrap-based extension to AIC, which they called EIC. This article compares the model-selection performance of AIC, EIC, a bootstrap-smoothed likelihood cross-validation (BCV), and its modification (632CV) in small-sample linear regression, logistic regression, and Cox regression. Simulation results show that EIC largely overcomes AIC's overfitting problem and that BCV may be better than EIC. Hence, the three methods based on bootstrapping the likelihood establish themselves as important alternatives to AIC in model selection with small samples.
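As a concrete illustration of the criteria being compared, the sketch below contrasts AIC with an EIC-style criterion for Gaussian linear regression: AIC's fixed penalty 2k is replaced by twice a bootstrap estimate of the optimism of the maximized log-likelihood, following the general idea of Ishiguro, Sakamoto, and Kitagawa. The function names, the case-resampling scheme, and the demo data are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_loglik(y, X, beta, sigma2):
    # Gaussian log-likelihood of y given design X, coefficients beta, variance sigma2
    n = len(y)
    resid = y - X @ beta
    return -0.5 * n * np.log(2 * np.pi * sigma2) - 0.5 * resid @ resid / sigma2

def fit_ols(y, X):
    # Maximum likelihood fit for linear regression: OLS coefficients and ML variance
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = np.mean((y - X @ beta) ** 2)
    return beta, sigma2

def aic(y, X):
    # AIC = -2 * max log-likelihood + 2k, k counting coefficients and the variance
    beta, sigma2 = fit_ols(y, X)
    k = X.shape[1] + 1
    return -2 * gauss_loglik(y, X, beta, sigma2) + 2 * k

def eic(y, X, B=200):
    # EIC-style criterion: penalty is 2 * bootstrap estimate of optimism,
    # i.e. how much the log-likelihood of a bootstrap fit on its own sample
    # exceeds its log-likelihood on the original data (averaged over B resamples)
    beta, sigma2 = fit_ols(y, X)
    n = len(y)
    bias = 0.0
    for _ in range(B):
        idx = rng.integers(0, n, size=n)   # case resampling with replacement
        yb, Xb = y[idx], X[idx]
        bb, s2b = fit_ols(yb, Xb)
        bias += gauss_loglik(yb, Xb, bb, s2b) - gauss_loglik(y, X, bb, s2b)
    bias /= B
    return -2 * gauss_loglik(y, X, beta, sigma2) + 2 * bias

# small demo: intercept-plus-slope model on simulated data
n = 60
X1 = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X1 @ np.array([1.0, 2.0]) + rng.normal(size=n)
aic_val, eic_val = aic(y, X1), eic(y, X1)
```

In the paper's small-sample setting the bootstrap penalty tends to be larger than AIC's fixed 2k, which is what counteracts AIC's overfitting.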