Mutual Information, Relative Entropy, and Estimation in the Poisson Channel

Let X be a non-negative random variable and let the conditional distribution of a random variable Y, given X, be Poisson(γ · X), for a parameter γ ≥ 0. We identify a natural loss function such that:
• The derivative of the mutual information between X and Y with respect to γ is equal to the minimum mean loss in estimating X based on Y, regardless of the distribution of X.
• When X ∼ P is estimated based on Y by a mismatched estimator that would have minimized the expected loss had X ∼ Q, the integral over all values of γ of the excess mean loss is equal to the relative entropy between P and Q.
For a continuous-time setting where X^T = {X_t, 0 ≤ t ≤ T} is a non-negative stochastic process and the conditional law of Y^T = {Y_t, 0 ≤ t ≤ T}, given X^T, is that of a non-homogeneous Poisson process with intensity function γ · X^T, under the same loss function:
• The minimum mean loss in causal filtering when γ = γ0 is equal to the expected value of the minimum m...
Rami Atar, Tsachy Weissman
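
As a quick numerical plausibility check of the first (scalar) claim in the abstract, the sketch below compares a finite-difference derivative of I(X; Y_γ) in γ against the minimum mean loss for a binary-valued X. It is not taken from the paper: the specific loss ℓ(x, x̂) = x log(x/x̂) − x + x̂ and the choice of a two-point distribution for X are assumptions for illustration, since the abstract only speaks of "a natural loss function". Python with numpy and scipy is assumed.

# Sketch: check that d/dγ I(X; Y_γ) matches the minimum mean loss in the Poisson channel.
# The loss l(x, xhat) = x*log(x/xhat) - x + xhat is an assumption; the abstract does not state it.
import numpy as np
from scipy.stats import poisson

xs = np.array([1.0, 2.0])      # support of X (illustrative two-point distribution)
px = np.array([0.5, 0.5])      # P(X = x)
ys = np.arange(0, 60)          # truncated support of Y; mass beyond 60 is negligible for these rates

def mutual_information(gamma):
    # I(X; Y) = sum_{x,y} p(x) p(y|x) log( p(y|x) / p(y) ), in nats
    pyx = np.array([poisson.pmf(ys, gamma * x) for x in xs])   # shape (|X|, |Y|)
    py = px @ pyx                                              # marginal p(y)
    ratio = pyx / py
    return float(np.sum(px[:, None] * pyx * np.log(ratio)))

def min_mean_loss(gamma):
    # conditional-mean estimator xhat(y) = E[X | Y = y] and its expected loss
    pyx = np.array([poisson.pmf(ys, gamma * x) for x in xs])
    py = px @ pyx
    post = (px[:, None] * pyx) / py                            # posterior p(x | y)
    xhat = xs @ post                                           # E[X | Y = y]
    loss = xs[:, None] * np.log(xs[:, None] / xhat) - xs[:, None] + xhat
    return float(np.sum(py * np.sum(post * loss, axis=0)))

gamma, h = 1.0, 1e-4
deriv = (mutual_information(gamma + h) - mutual_information(gamma - h)) / (2 * h)
print(deriv, min_mean_loss(gamma))   # the two values should agree to roughly 1e-6

Changing the distribution of X (e.g. the weights in px or the support xs) should not break the agreement, which is the point of the "regardless of the distribution of X" clause in the abstract.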
Type: Journal
Year: 2011
Where: CORR
Authors: Rami Atar, Tsachy Weissman