P.-N. Chen and F. Alajaji, "Generalized source coding theorems and hypothesis testing - Part I: Information measures," Zhongguo Gongcheng Xuekan (Journal of the Chinese Institute of Engineers), 21(3), 1998, pp. 283-292
Expressions for the epsilon-entropy rate, epsilon-mutual information rate, and epsilon-divergence rate are introduced. These quantities, which consist of the quantiles of the asymptotic information spectra, generalize the inf/sup-entropy/information/divergence rates of Han and Verdú. The algebraic properties of these information measures are rigorously analyzed, and examples illustrating their use in the computation of the epsilon-capacity are presented. In Part II of this work, these measures are employed to prove general source coding theorems for block codes and to establish the general formula for the type-II error exponent of Neyman-Pearson hypothesis testing subject to upper bounds on the type-I error probability.
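As a rough illustration of what "quantiles of the asymptotic information spectra" means, the following sketch uses the information-spectrum notation of Han and Verdú; the epsilon-quantile form shown last is an assumed reading of the abstract, not necessarily the paper's exact convention.

```latex
% Normalized entropy density (information spectrum) of a source X = {X^n}:
\[
  h_n(X^n) \;=\; \frac{1}{n}\,\log\frac{1}{P_{X^n}(X^n)}
\]
% Han--Verd\'u sup-entropy rate: the smallest threshold \beta that the
% spectrum asymptotically does not exceed with probability one:
\[
  \bar{H}(X) \;=\; \inf\Bigl\{\beta \;:\; \lim_{n\to\infty}
      \Pr\bigl[\,h_n(X^n) > \beta\,\bigr] = 0\Bigr\}
\]
% Epsilon-quantile relaxation (assumed form): the exceedance probability
% is allowed to remain as large as \epsilon instead of vanishing:
\[
  \bar{H}_{\epsilon}(X) \;=\; \inf\Bigl\{\beta \;:\; \limsup_{n\to\infty}
      \Pr\bigl[\,h_n(X^n) > \beta\,\bigr] \le \epsilon\Bigr\},
  \qquad \epsilon\in[0,1)
\]
```

Setting epsilon to zero recovers the ordinary sup-entropy rate, which is the sense in which these quantities generalize the Han-Verdú rates.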