R. Zamir, "A proof of the Fisher information inequality via a data-processing argument," IEEE Transactions on Information Theory, 44(3), 1998, pp. 1246-1250
Citation count: 16
Subject Categories: "Computer Science, Information Systems", "Engineering, Electrical & Electronic"
The Fisher information J(X) of a random variable X under a translation parameter appears in information theory in the classical proof of the Entropy-Power Inequality (EPI). It enters the proof of the EPI via the De Bruijn identity, where it measures the variation of the differential entropy under a Gaussian perturbation, and via the convolution inequality J(X + Y)^{-1} >= J(X)^{-1} + J(Y)^{-1} (for independent X and Y), known as the Fisher Information Inequality (FII). In the literature, the FII is proved directly, in a rather involved way. We give an alternative derivation of the FII as a simple consequence of a "data-processing inequality" for the Cramer-Rao lower bound on parameter estimation.
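The convolution inequality above can be illustrated with a minimal numerical sanity check (my own illustration, not from the paper): for a Gaussian N(mu, sigma^2), the Fisher information under a translation parameter is J(X) = 1/sigma^2, and the sum of independent Gaussians is again Gaussian, so the FII holds with equality.

```python
# Sanity check of the Fisher Information Inequality (FII)
#   J(X + Y)^{-1} >= J(X)^{-1} + J(Y)^{-1}
# on independent Gaussians, where J(X) = 1/sigma^2 under a
# translation (location) parameter. Gaussians achieve equality.

def gaussian_fisher_info(sigma2):
    """Fisher information of N(mu, sigma2) w.r.t. the location mu."""
    return 1.0 / sigma2

sx2, sy2 = 2.0, 3.0                      # variances of independent X and Y
jx = gaussian_fisher_info(sx2)
jy = gaussian_fisher_info(sy2)
jxy = gaussian_fisher_info(sx2 + sy2)    # X + Y ~ N(., sx2 + sy2)

lhs = 1.0 / jxy                          # J(X + Y)^{-1}
rhs = 1.0 / jx + 1.0 / jy                # J(X)^{-1} + J(Y)^{-1}
print(lhs, rhs, lhs >= rhs)              # equality for Gaussians
```

For non-Gaussian X or Y the inequality is strict; the paper's point is that this follows from a data-processing argument for the Cramer-Rao bound rather than from the direct computation.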