Mismatched estimation and relative entropy

A random variable with distribution P is observed in Gaussian noise and is estimated by a minimum mean-square estimator that assumes that the distribution is Q. This paper shows that the integral over all signal-to-noise ratios of the excess mean-square estimation error incurred by the mismatched estimator is twice the relative entropy D(P‖Q). This representation of relative entropy can be generalized to non-real-valued random variables, and can be particularized to give a new general representation of mutual information in terms of conditional means. Inspired by the new representation, we also propose a definition of free relative entropy, which fills a gap in, and is consistent with, the literature on free probability.
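The central identity is easy to check numerically in a case where everything has a closed form. With Y = sqrt(snr)·X + N and N standard Gaussian, the result reads integral_0^inf [mse_Q(snr) - mse_P(snr)] d(snr) = 2·D(P‖Q), where mse_Q is the mean-square error of the conditional-mean estimator designed for prior Q but applied when X actually follows P. The sketch below is not from the paper: it assumes P = N(0, 1) and Q = N(mu, sigma2), for which the mismatched estimator is linear and D(P‖Q) is known in closed form, and all names and parameter values are illustrative.

```python
import numpy as np
from scipy.integrate import quad

# Illustrative mismatched prior Q = N(mu, sigma2); true prior fixed to P = N(0, 1).
mu, sigma2 = 0.5, 2.0

def mse_matched(snr):
    # MMSE of estimating X ~ N(0,1) from Y = sqrt(snr)*X + N, N ~ N(0,1).
    return 1.0 / (1.0 + snr)

def mse_mismatched(snr):
    # MSE, under the true prior P = N(0,1), of the conditional-mean estimator
    # derived assuming Q = N(mu, sigma2).  For a Gaussian Q that estimator is
    # linear: xhat = a*Y + mu*(1 - a*sqrt(snr)), a = sqrt(snr)*sigma2/(snr*sigma2 + 1).
    s = np.sqrt(snr)
    a = s * sigma2 / (snr * sigma2 + 1.0)
    c = 1.0 - a * s  # coefficient multiplying the residual error in X
    return c * c * (1.0 + mu * mu) + a * a

# Integrate the excess mean-square error over all signal-to-noise ratios.
excess, _ = quad(lambda g: mse_mismatched(g) - mse_matched(g), 0.0, np.inf)

# Closed-form D(P||Q) in nats for P = N(0,1), Q = N(mu, sigma2).
D = 0.5 * np.log(sigma2) + (1.0 + mu * mu) / (2.0 * sigma2) - 0.5

print(f"integral of excess MSE: {excess:.6f}")   # ~ 0.318148 for these parameters
print(f"2 * D(P||Q):            {2.0 * D:.6f}")  # ~ 0.318148, matching the identity
```

Setting mu = 0 and sigma2 = 1 makes Q = P, the excess MSE vanishes at every SNR, and both sides of the identity are zero, which is a useful degenerate check.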
Type: Journal
Year: 2010
Where: IEEE Transactions on Information Theory (TIT)
Authors: Sergio Verdú