Gaussian codes and Shannon bounds for multiple descriptions

Authors
R. Zamir
Citation
R. Zamir, Gaussian codes and Shannon bounds for multiple descriptions, IEEE Trans. Inform. Theory, 45(7), 1999, pp. 2629-2636
Citations number
17
Subject Categories
Information Technology & Communication Systems
Journal title
IEEE TRANSACTIONS ON INFORMATION THEORY
ISSN journal
0018-9448
Volume
45
Issue
7
Year of publication
1999
Pages
2629 - 2636
Database
ISI
SICI code
0018-9448(199911)45:7<2629:GCASBF>2.0.ZU;2-F
Abstract
A pair of well-known inequalities due to Shannon upper- and lower-bound the rate-distortion function of a real source by the rate-distortion function of the Gaussian source with the same variance and the same entropy, respectively. We extend these bounds to multiple descriptions, a problem for which a general "single-letter" solution is not known. We show that the set $D_X(R_1, R_2)$ of achievable marginal ($d_1$, $d_2$) and central ($d_0$) mean-squared errors in decoding $X$ from two descriptions at rates $R_1$ and $R_2$ satisfies $D^*(\sigma_x^2, R_1, R_2) \subseteq D_X(R_1, R_2) \subseteq D^*(P_x, R_1, R_2)$, where $\sigma_x^2$ and $P_x$ are the variance and the entropy power of $X$, respectively, and $D^*(\sigma^2, R_1, R_2)$ is the multiple-description distortion region for a Gaussian source with variance $\sigma^2$ found by Ozarow. We further show that, as in the single-description case, a Gaussian random code achieves the outer bound in the limit as $d_1, d_2 \to 0$; thus the outer bound is asymptotically tight under high-resolution conditions.
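
For context (not part of the record), the quantities in the abstract can be sketched as follows. The entropy power and the single-description Shannon bounds are standard definitions; Ozarow's Gaussian region is reproduced here in its commonly cited unit-variance form, so the exact statement should be checked against the paper and Ozarow (1980).

% Entropy power of a real source X with differential entropy h(X),
% and the single-description Shannon bounds on the MSE rate-distortion function:
\[
P_x = \frac{1}{2\pi e}\, e^{2h(X)},
\qquad
\frac{1}{2}\log\frac{P_x}{D} \;\le\; R_X(D) \;\le\; \frac{1}{2}\log\frac{\sigma_x^2}{D},
\quad 0 < D \le \sigma_x^2 .
\]
% Ozarow's region D^*(1, R_1, R_2) for a unit-variance Gaussian source
% (for variance \sigma^2, scale all distortions by \sigma^2):
\[
d_1 \ge 2^{-2R_1}, \qquad d_2 \ge 2^{-2R_2}, \qquad
d_0 \ge \frac{2^{-2(R_1+R_2)}}
           {1-\Bigl(\sqrt{(1-d_1)(1-d_2)}-\sqrt{d_1 d_2 - 2^{-2(R_1+R_2)}}\Bigr)^{2}},
\]
% the bound on d_0 applying when d_1 + d_2 < 1 + 2^{-2(R_1+R_2)};
% otherwise d_0 \ge 2^{-2(R_1+R_2)} suffices.

Since $P_x \le \sigma_x^2$ (with equality iff $X$ is Gaussian), the region $D^*(P_x, R_1, R_2)$ contains $D^*(\sigma_x^2, R_1, R_2)$, which is the sense in which the two Gaussian regions sandwich $D_X(R_1, R_2)$; the abstract's final claim is that the outer region is approached as $d_1, d_2 \to 0$.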