The Fano inequality gives a lower bound on the mutual information between two random variables that take values on an M-element set, provided at least one of the random variables is equiprobable. We show several simple lower bounds on mutual information which do not assume such a restriction. In particular, this can be accomplished by replacing log M with the infinite-order Rényi entropy in the Fano inequality. Applications to hypothesis testing are exhibited, along with bounds on mutual information in terms of the a priori and a posteriori error probabilities.
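As a sketch of the substitution described above (notation assumed, not taken verbatim from the paper): writing P_e for the error probability and h(·) for the binary entropy function, the classical Fano inequality for X equiprobable on an M-element set, and the variant obtained by replacing log M with the infinite-order Rényi entropy, read as follows.

```latex
% Classical Fano inequality: H(X|Y) <= h(P_e) + P_e log(M-1).
% When X is equiprobable, H(X) = \log M, so
I(X;Y) \;\ge\; \log M \;-\; h(P_e) \;-\; P_e \log (M-1).

% Replacing \log M with the infinite-order Rényi entropy
%   H_\infty(X) \;=\; -\log \max_x P_X(x)
% (which equals \log M exactly when X is equiprobable)
% yields a bound of the same shape without the
% equiprobability assumption:
I(X;Y) \;\ge\; H_\infty(X) \;-\; h(P_e) \;-\; P_e \log (M-1).
```

Note that H_∞(X) ≤ H(X) ≤ log M in general, with equality throughout in the equiprobable case, so the second bound reduces to the first under the original assumption.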