The generalized information criterion (GIC) selects a linear regression model by minimizing the sum of squared residuals plus a penalty parameter lambda times a linear function of the model dimension. It is known that the GIC is asymptotically consistent in the sense that the probability of selecting a non-optimal model by the GIC converges to zero when lambda tends to infinity (as the sample size increases to infinity) at a certain rate. In the present paper we establish convergence rates for the error probabilities of the GIC in terms of lambda and the order of the design matrix. The rates obtained here are sharper than the existing ones in the literature when the distribution of the response variable is nonnormal. A discussion of the choice of the penalty parameter lambda is also given.
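The selection rule described above can be sketched as follows. This is a minimal illustration, not the paper's procedure: it assumes a finite list of candidate column subsets, takes the penalty to be lambda times the subset size (the simplest linear function of the model dimension), and uses a BIC-like choice lambda = log(n) purely as an example; the function name `gic_select` and the candidate-list interface are hypothetical.

```python
import numpy as np

def gic_select(y, X, candidates, lam):
    """Return the candidate column subset minimizing RSS + lam * |subset|.

    y          : (n,) response vector
    X          : (n, p) design matrix
    candidates : iterable of tuples of column indices into X
    lam        : penalty parameter lambda
    """
    best_cols, best_score = None, np.inf
    for cols in candidates:
        Xc = X[:, list(cols)]
        # Least-squares fit on the submodel; residuals give the RSS term.
        beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
        rss = np.sum((y - Xc @ beta) ** 2)
        score = rss + lam * len(cols)  # GIC with penalty linear in dimension
        if score < best_score:
            best_cols, best_score = cols, score
    return best_cols, best_score

# Example: only the first two columns carry signal, so with a nonzero
# penalty the criterion prefers (0, 1) over larger subsets of equal fit.
rng = np.random.default_rng(0)
n = 50
X = rng.normal(size=(n, 4))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1]  # noiseless toy response
cands = [(0,), (0, 1), (0, 1, 2), (0, 1, 2, 3)]
cols, score = gic_select(y, X, cands, lam=np.log(n))
```

In this toy setting every superset of (0, 1) fits equally well (zero residual), so the penalty alone breaks the tie in favor of the smallest correct model, which is the mechanism behind the consistency result the abstract refers to.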