New results are proved on the convergence of the Shannon lower bound to the rate distortion function as the distortion decreases to zero. The key convergence result is proved using a fundamental property of informational divergence. As a corollary, it is shown that the Shannon lower bound is asymptotically tight for norm-based distortions when the source vector has finite differential entropy and a finite alpha-th moment for some alpha > 0, with respect to the given norm. Moreover, we derive a theorem of Linkov on the asymptotic tightness of the Shannon lower bound for general difference distortion measures under more relaxed conditions on the source density. We also show that the Shannon lower bound relative to a stationary source and a single-letter difference distortion is asymptotically tight under very weak assumptions on the source distribution.
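
For context, a standard formulation of the Shannon lower bound for a difference distortion measure can be sketched as follows; the notation (h for differential entropy, rho for the per-letter distortion) is ours and is not taken from the abstract itself. For a source X with a density and distortion d(x, y) = rho(x - y),
\[
  R(D) \;\ge\; R_{\mathrm{SLB}}(D) \;=\; h(X) \;-\; \sup_{Z:\ \mathbb{E}[\rho(Z)] \le D} h(Z),
\]
and asymptotic tightness means that R(D) - R_SLB(D) tends to 0 as D decreases to 0.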