The relation between the entropy of a discrete random variable and the
minimum attainable probability of error made in guessing its value is
examined. While Fano's inequality provides a tight lower bound on the
error probability in terms of the entropy, we derive a converse result: a tight upper bound on the minimal error probability in terms of the entropy.
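Stated concretely (a sketch in notation of our choosing: logarithms base 2, h(·) the binary entropy function, M the alphabet size, and P_e the minimal, i.e., MAP, error probability; the upper bound displayed is the simple exponential one, whereas the tight bound derived in the paper is piecewise linear in the entropy):

\[
  H(X \mid Y) \le h(P_e) + P_e \log (M - 1) \qquad \text{(Fano's inequality)}
\]
\[
  P_e \le 1 - 2^{-H(X \mid Y)} \qquad \text{(converse direction)}
\]

The second display follows from \(\max_x p(x \mid y) \ge 2^{-H(X \mid Y = y)}\) together with Jensen's inequality applied to the convex map \(t \mapsto 2^{-t}\).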
Both bounds are sharp, and they draw a relation, as well, between the error probability of the maximum a posteriori (MAP) rule and the conditional entropy (equivocation), which is a useful uncertainty measure in several applications. Combining this relation with the classical channel coding theorem, we present a channel coding theorem for the equivocation which, unlike the channel coding theorem for error probability, is meaningful at all rates. This theorem is proved directly for discrete memoryless channels (DMC's), and from this proof it is further concluded that for R ≥ C the equivocation achieves its minimal value of R - C at the rate of n^{-1/2}, where n is the block length.
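In symbols, the concluding claim can be read as follows (our notation, not fixed by the text: W is the message of a rate-R block code of length n, Y^n the output block of a DMC with capacity C, and the minimum is taken over codes and decoders):

\[
  \min \frac{1}{n} H\!\left(W \mid Y^n\right) = R - C + O\!\left(n^{-1/2}\right), \qquad R \ge C .
\]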