Local asymptotic arguments imply that parameter selection via the minimum description length (MDL) principle resembles a traditional hypothesis test. A common approximation for MDL estimates the cost of adding a parameter at about (1/2) log n bits for a model fit to n observations. While accurate for parameters that are large on a standardized scale, this approximation overstates the parameter cost near zero. We find that encoding the parameter produces a shorter description length when the corresponding estimator is about two standard errors away from zero, as in a traditional statistical hypothesis test.
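The contrast with the two-standard-error rule can be illustrated numerically. Under the (1/2) log n approximation, and assuming the usual likelihood-ratio approximation in which adding a parameter improves the fit by about z²/2 nats (where z is the estimate divided by its standard error), the break-even point falls at z = √(ln n), which grows with n rather than staying near 2. A minimal sketch of this comparison (the function name and sample sizes are illustrative, not from the paper):

```python
import math

def breakeven_z(n):
    """Break-even z-statistic under the (1/2) log n parameter-cost
    approximation: the fit improves by about z^2/2 nats, while the
    parameter costs (1/2) ln n nats, so the costs balance at
    z = sqrt(ln n). (Illustrative sketch, not the paper's refined code.)"""
    return math.sqrt(math.log(n))

for n in (100, 1000, 10000):
    # The naive threshold drifts upward with n, unlike the roughly
    # fixed two-standard-error threshold described in the text.
    print(n, round(breakeven_z(n), 2))
```

For n = 100 the naive threshold is already about 2.15 standard errors and it keeps rising with n, which is consistent with the claim that the (1/2) log n approximation overstates the cost of parameters near zero relative to a fixed two-standard-error criterion.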