RELATIONS BETWEEN ENTROPY AND ERROR-PROBABILITY

Authors
M. Feder, N. Merhav
Citation
M. Feder and N. Merhav, RELATIONS BETWEEN ENTROPY AND ERROR-PROBABILITY, IEEE Transactions on Information Theory, 40(1), 1994, pp. 259-266
Citations number
11
Subject Categories
Information Science & Library Science; Engineering, Electrical & Electronic
ISSN journal
0018-9448
Volume
40
Issue
1
Year of publication
1994
Pages
259 - 266
Database
ISI
SICI code
0018-9448(1994)40:1<259:RBEAE>2.0.ZU;2-N
Abstract
The relation between the entropy of a discrete random variable and the minimum attainable probability of error made in guessing its value is examined. While Fano's inequality provides a tight lower bound on the error probability in terms of the entropy, we derive a converse result: a tight upper bound on the minimal error probability in terms of the entropy. Both bounds are sharp, and they also establish a relation between the error probability of the maximum a posteriori (MAP) rule and the conditional entropy (equivocation), which is a useful uncertainty measure in several applications. Combining this relation with the classical channel coding theorem, we present a channel coding theorem for the equivocation which, unlike the channel coding theorem for error probability, is meaningful at all rates. This theorem is proved directly for DMC's, and from this proof it is further concluded that for R ≥ C the equivocation achieves its minimal value of R - C at the rate of n^(-1/2), where n is the block length.
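As a minimal numerical sketch of the Fano side of the relation discussed above (not the authors' converse bound), the snippet below computes, for a hypothetical example distribution, the entropy H(X) and the minimal (MAP) error probability Pe, and checks Fano's inequality H(X) ≤ h_b(Pe) + Pe·log2(M-1) for guessing X over an alphabet of size M with no side information. The function names are illustrative, not from the paper.

```python
import math


def entropy_bits(p):
    """Shannon entropy (in bits) of a probability vector p."""
    return -sum(x * math.log2(x) for x in p if x > 0)


def map_error(p):
    """Minimal attainable error probability when guessing X with no side
    information: guess the most likely value and err otherwise."""
    return 1.0 - max(p)


def fano_entropy_bound(pe, m):
    """Right-hand side of Fano's inequality: h_b(Pe) + Pe * log2(M - 1)."""
    if pe in (0.0, 1.0):
        hb = 0.0
    else:
        hb = -pe * math.log2(pe) - (1 - pe) * math.log2(1 - pe)
    return hb + pe * math.log2(m - 1)


# Hypothetical example distribution over a ternary alphabet.
p = [0.5, 0.25, 0.25]
H = entropy_bits(p)                 # 1.5 bits
Pe = map_error(p)                   # 0.5
bound = fano_entropy_bound(Pe, len(p))

print(f"H = {H}, Pe = {Pe}, Fano bound = {bound}")
# Fano's inequality H <= h_b(Pe) + Pe*log2(M-1) holds; for this particular
# distribution it happens to hold with equality, illustrating sharpness.
assert H <= bound + 1e-12
```

Equality here illustrates that Fano's lower bound on Pe (equivalently, the upper bound on H given Pe) is sharp, which is the starting point for the paper's converse: a tight upper bound on Pe in terms of H.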