We derive a new upper bound on the exponent of error probability of decoding for the best possible codes in the Gaussian channel. This bound is tighter than the known upper bounds (the sphere-packing and minimum-distance bounds proved in Shannon's classical 1959 paper and their low-rate improvement by Kabatiansky and Levenshtein). The proof is accomplished by studying asymptotic properties of codes on the sphere $S^{n-1}(\mathbb{R})$. First we prove a general lower bound on the distance distribution of codes of large size. To derive specific estimates of the distance distribution, we study the asymptotic behavior of Jacobi polynomials $P_k^{(a_k, b_k)}$ as $k \to \infty$.
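(For orientation, and as a standard fact rather than a quotation from the paper: $P_k^{(a,b)}$ denotes the Jacobi polynomial of degree $k$, orthogonal on $[-1,1]$ with respect to the weight
$$ w(x) = (1-x)^{a}\,(1+x)^{b}, \qquad a, b > -1; $$
the asymptotic regime considered here is one in which the parameters $a_k, b_k$ themselves vary with the degree $k$.)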
Since on the average there are many code vectors in the vicinity of the transmitted vector $x$, one can show that the probability of confusing $x$ and one of these vectors cannot be too small. This proves a lower bound on the error probability of decoding and the upper bound announced in the title.
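As a point of reference (this definition is standard and not quoted from the abstract), the exponent in question is the reliability function of the channel,
$$ E(R) \;=\; \limsup_{n \to \infty} \, -\frac{1}{n} \ln P_{e}^{\mathrm{opt}}(n, R), $$
where $P_{e}^{\mathrm{opt}}(n, R)$ is the smallest probability of decoding error achievable by codes of length $n$ and rate $R$; an upper bound on $E(R)$ thus follows from a lower bound on the error probability of the best codes.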