We present a lower bound on the probability of symbol error for maximum-likelihood decoding of lattices and lattice codes on a Gaussian channel. The bound is tight at the error probabilities and signal-to-noise ratios of practical interest, unlike most existing bounds, which become tight only asymptotically at high signal-to-noise ratios. The bound is also universal: it limits the highest coding gain achievable, at a given symbol error probability, by any lattice or lattice code in n dimensions. In particular, it is shown that the effective coding gains of the densest known lattices are much lower than their nominal coding gains. The asymptotic (as n → ∞) behavior of the new bound is shown to coincide with the Shannon limit for Gaussian channels.
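
For reference, the Shannon limit mentioned above is the standard capacity benchmark for the additive white Gaussian noise channel. The sketch below states this background fact only; it is not the new bound itself, and the notation (P for signal power, N for noise variance per dimension) is introduced here for illustration.

% Shannon capacity of the AWGN channel (standard background fact):
\[
  C \;=\; \tfrac{1}{2}\,\log_2\!\Bigl(1 + \tfrac{P}{N}\Bigr)
  \quad \text{bits per dimension.}
\]
% Reliable communication requires rate R < C. Equivalently, in terms of
% energy per information bit, E_b/N_0 must exceed \ln 2 (about -1.59 dB)
% in the limit of vanishing rate. The abstract states that, as n → ∞,
% the new bound approaches this Shannon limit for Gaussian channels.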