GENERALIZED SOURCE-CODING THEOREMS AND HYPOTHESIS TESTING - PART I: INFORMATION MEASURES

Authors
P.N. Chen and F. Alajaji
Citation
P.N. Chen and F. Alajaji, GENERALIZED SOURCE-CODING THEOREMS AND HYPOTHESIS TESTING - PART I: INFORMATION MEASURES, Zhongguo gongcheng xuekan, 21(3), 1998, pp. 283-292
Citations number
11
Subject categories
Engineering
Journal title
Zhongguo gongcheng xuekan
ISSN journal
0253-3839
Volume
21
Issue
3
Year of publication
1998
Pages
283 - 292
Database
ISI
SICI code
0253-3839(1998)21:3<283:GSTAH->2.0.ZU;2-X
Abstract
Expressions for the epsilon-entropy rate, epsilon-mutual information rate and epsilon-divergence rate are introduced. These quantities, which consist of the quantiles of the asymptotic information spectra, generalize the inf/sup-entropy/information/divergence rates of Han and Verdú. The algebraic properties of these information measures are rigorously analyzed, and examples illustrating their use in the computation of the epsilon-capacity are presented. In Part II of this work, these measures are employed to prove general source coding theorems for block codes and to establish the general formula for the Neyman-Pearson type-II error exponent of hypothesis testing subject to upper bounds on the type-I error probability.
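
As a hedged illustration of the quantile construction mentioned in the abstract, the LaTeX sketch below writes out one plausible form for the entropy case only. The symbols h_n, \bar{F}_X, \underline{F}_X and the particular inequality/quantile conventions are assumptions made here for illustration; they are not taken from the record and may differ from the paper's exact definitions.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Entropy density of a general source X = {X^n}, n >= 1 (assumed notation):
\[
  h_n(X^n) := -\tfrac{1}{n}\log P_{X^n}(X^n).
\]
% Asymptotic entropy spectra: limsup/liminf of the distribution of h_n.
\[
  \bar{F}_X(\theta) := \limsup_{n\to\infty} \Pr\bigl[h_n(X^n) \le \theta\bigr],
  \qquad
  \underline{F}_X(\theta) := \liminf_{n\to\infty} \Pr\bigl[h_n(X^n) \le \theta\bigr].
\]
% An epsilon-quantile of a spectrum then yields an epsilon-entropy rate,
% e.g. under one possible convention with 0 <= epsilon < 1:
\[
  H_\varepsilon(X) := \sup\{\theta : \bar{F}_X(\theta) \le \varepsilon\},
\]
% which, as epsilon -> 0, recovers the Han--Verdu inf-entropy rate
% (the liminf in probability of h_n); the epsilon-mutual information and
% epsilon-divergence rates are built analogously from the information and
% divergence densities.
\end{document}

The paper itself fixes the precise conventions (strict versus non-strict inequalities, which spectrum is used for which rate); the display above is only meant to show how taking a quantile of the spectrum generalizes the inf/sup rates of Han and Verdú.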