R. Zamir and M. Feder, "A Generalization of the Entropy Power Inequality with Applications," IEEE Transactions on Information Theory, 39(5), 1993, pp. 1723-1728
We prove the following generalization of the Entropy Power Inequality:
\[
h(A\underline{x}) \ge h(A\underline{\tilde{x}}),
\]
where $h(\cdot)$ denotes (joint) differential entropy, $\underline{x} = x_1, \ldots, x_n$ is a random vector with independent components, $\underline{\tilde{x}} = \tilde{x}_1, \ldots, \tilde{x}_n$ is a Gaussian vector with independent components such that $h(\tilde{x}_i) = h(x_i)$, $i = 1, \ldots, n$, and $A$ is any matrix.
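Note that for invertible $A$ both sides shift by the same $\ln|\det A|$ and coincide, since $h(\underline{x}) = \sum_i h(x_i) = \sum_i h(\tilde{x}_i) = h(\underline{\tilde{x}})$; the content of the inequality therefore lies in non-invertible (e.g., rectangular) $A$. As a quick numerical illustration (our own example, not from the paper), the simplest such choice $A = [1\ 1]$ reduces the result to the classical scalar EPI in entropy form, $h(x_1 + x_2) \ge h(\tilde{x}_1 + \tilde{x}_2)$. A minimal Python sketch, assuming i.i.d. Uniform(0,1) components and entropies in nats:

    # Hypothetical numerical check (not from the paper): with A = [1 1] the
    # result reads h(x1 + x2) >= h(x1~ + x2~), the classical EPI in entropy form.
    import numpy as np

    rng = np.random.default_rng(0)

    # h(Uniform(0,1)) = ln(1) = 0 nats, so the entropy-matched Gaussian has
    # (1/2) ln(2*pi*e*s2) = 0, i.e. variance s2 = 1/(2*pi*e).
    s2 = 1.0 / (2 * np.pi * np.e)

    # Right-hand side, closed form: h(x1~ + x2~) = (1/2) ln(2*pi*e*2*s2) = (1/2) ln 2.
    h_rhs = 0.5 * np.log(2 * np.pi * np.e * 2 * s2)  # ~ 0.3466 nats

    # Left-hand side: x1 + x2 has a triangular density on [0, 2] with h = 1/2 nat
    # exactly; cross-check with a plug-in histogram estimate from samples.
    s = rng.uniform(0, 1, 10**6) + rng.uniform(0, 1, 10**6)
    dens, edges = np.histogram(s, bins=200, density=True)
    width = edges[1] - edges[0]
    p = dens[dens > 0]
    h_lhs = -np.sum(p * np.log(p)) * width  # ~ 0.5 nat

    print(f"h(x1+x2)    ~ {h_lhs:.4f} nats (exact: 0.5)")
    print(f"h(x1~+x2~)  = {h_rhs:.4f} nats")
    print("h(Ax) >= h(Ax~):", h_lhs >= h_rhs)

The histogram plug-in estimator is crude but adequate here; the exact value of 1/2 nat follows from integrating $-f \ln f$ over the triangular density.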
This generalization of the Entropy Power Inequality is applied to show that a non-Gaussian vector with independent components becomes "closer" to Gaussianity after a linear transformation, where the distance to Gaussianity is measured by the information divergence.
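To make "closer to Gaussianity" concrete (again with our own numbers, not the paper's), one can use the divergence $D(x \,\|\, x_G) = h(x_G) - h(x)$, where $x_G$ is Gaussian with the same variance as $x$; this identity holds because $-\mathrm{E}[\ln g(x)]$ for a Gaussian density $g$ depends only on the first two moments. A sketch for the averaging map above:

    # Hypothetical illustration: divergence to Gaussianity before and after the
    # linear map A = [1 1], using D(x || x_G) = h(x_G) - h(x) with x_G Gaussian
    # of equal variance.
    import numpy as np

    ln2pie = np.log(2 * np.pi * np.e)

    # One component: Uniform(0,1) has variance 1/12 and h = 0 nats.
    D_before = 0.5 * (ln2pie + np.log(1 / 12)) - 0.0  # ~ 0.1765 nats

    # After the map: x1 + x2 is triangular with variance 1/6 and h = 1/2 nat.
    D_after = 0.5 * (ln2pie + np.log(1 / 6)) - 0.5    # ~ 0.0231 nats

    print(f"D before: {D_before:.4f} nats")
    print(f"D after : {D_after:.4f} nats  (smaller, i.e. closer to Gaussian)")

The divergence to Gaussianity is invariant under scaling, so the same numbers hold for the normalized average $(x_1 + x_2)/2$.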
Another application is a positive lower bound on the mutual information between nonoverlapping spectral components of a non-Gaussian white process. Finally, we describe a dual generalization of the Fisher Information Inequality.
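For reference, the classical scalar inequalities being generalized here (standard results due to Stam and Blachman, not the paper's new dual form) are, for independent $x$ and $y$,
\[
e^{2h(x+y)} \ge e^{2h(x)} + e^{2h(y)},
\qquad
\frac{1}{J(x+y)} \ge \frac{1}{J(x)} + \frac{1}{J(y)},
\]
where $J(\cdot)$ denotes Fisher information; the EPI and the FII are classically regarded as dual in this sense.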