Renyi's divergence and entropy rates for finite alphabet Markov sources

Citation
Z. Rached et al., Renyi's divergence and entropy rates for finite alphabet Markov sources, IEEE INFO T, 47(4), 2001, pp. 1553-1561
Citations number
23
Subject Categories
Information Technology & Communication Systems
Journal title
IEEE TRANSACTIONS ON INFORMATION THEORY
ISSN journal
0018-9448
Volume
47
Issue
4
Year of publication
2001
Pages
1553 - 1561
Database
ISI
SICI code
0018-9448(200105)47:4<1553:RDAERF>2.0.ZU;2-Y
Abstract
In this work, we examine the existence and the computation of the Renyi divergence rate, $\lim_{n \to \infty} \frac{1}{n} D_{\alpha}(p^{(n)} \,\|\, q^{(n)})$, between two time-invariant finite-alphabet Markov sources of arbitrary order and arbitrary initial distributions, described by the probability distributions $p^{(n)}$ and $q^{(n)}$, respectively. This yields a generalization of a result of Nemetz, where he assumed that the initial probabilities under $p^{(n)}$ and $q^{(n)}$ are strictly positive. The main tools used to obtain the Renyi divergence rate are the theory of nonnegative matrices and Perron-Frobenius theory. We also provide numerical examples and investigate the limits of the Renyi divergence rate as $\alpha \to 1$ and as $\alpha \downarrow 0$. Similarly, we provide a formula for the Renyi entropy rate $\lim_{n \to \infty} \frac{1}{n} H_{\alpha}(p^{(n)})$ of Markov sources and examine its limits as $\alpha \to 1$ and as $\alpha \downarrow 0$. Finally, we briefly provide an application to source coding.
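The divergence-rate formula mentioned in the abstract can be illustrated numerically. The sketch below (Python with NumPy) assumes the classical Nemetz-type setting that the paper generalizes: irreducible first-order chains with transition matrices P and Q and strictly positive initial probabilities, for which the Renyi divergence rate is $\frac{1}{\alpha-1}\log\lambda(R)$, where $\lambda(R)$ is the Perron-Frobenius eigenvalue of the matrix with entries $P_{ij}^{\alpha} Q_{ij}^{1-\alpha}$, and the Renyi entropy rate is $\frac{1}{1-\alpha}\log\lambda(S)$ with $S_{ij} = P_{ij}^{\alpha}$. The function names and the two-state example matrices are illustrative only and are not taken from the paper.

import numpy as np

def renyi_divergence_rate(P, Q, alpha):
    # Renyi divergence rate (in nats) between two irreducible Markov chains
    # with transition matrices P and Q, for alpha > 0, alpha != 1.
    R = (P ** alpha) * (Q ** (1.0 - alpha))
    lam = np.max(np.abs(np.linalg.eigvals(R)))   # Perron-Frobenius eigenvalue (spectral radius)
    return np.log(lam) / (alpha - 1.0)

def renyi_entropy_rate(P, alpha):
    # Renyi entropy rate (in nats) of an irreducible Markov chain with
    # transition matrix P, for alpha > 0, alpha != 1.
    S = P ** alpha
    lam = np.max(np.abs(np.linalg.eigvals(S)))
    return np.log(lam) / (1.0 - alpha)

# Hypothetical two-state example: as alpha -> 1 the divergence rate should
# approach the Kullback-Leibler divergence rate and the entropy rate should
# approach the Shannon entropy rate.
P = np.array([[0.9, 0.1], [0.2, 0.8]])
Q = np.array([[0.5, 0.5], [0.4, 0.6]])
for a in (0.5, 0.9, 0.99, 1.5):
    print(a, renyi_divergence_rate(P, Q, a), renyi_entropy_rate(P, a))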