
On the Minimum Entropy of a Mixture of Unimodal and Symmetric Distributions

Progressive encoding of a signal generally involves an estimation step, designed to reduce the entropy of the residual of an observation below the entropy of the observation itself. Oftentimes the conditional distributions of an observation, given already-encoded observations, are well fit by a class of symmetric and unimodal distributions (e.g., the two-sided geometric distributions in images of natural scenes, or symmetric Paretian distributions in models of financial data). It is common practice to choose an estimator that centers, or aligns, the modes of the conditional distributions, on the intuition that this minimizes the entropy, and hence the coding cost, of the residuals. But, with the exception of a special case, there has been no rigorous proof. Here we prove that the entropy of an arbitrary mixture of symmetric and unimodal distributions is minimized by aligning the modes. The result generalizes to unimodal and rotation-invariant distributions in R^n. We illust...
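A minimal restatement of the claim as we read it from the abstract; the notation (component densities f_i, weights p_i, shifts c_i) is ours and not necessarily the paper's:

Let $f_1,\dots,f_k$ be probability densities on $\mathbb{R}$, each unimodal and symmetric about its mode $m_i$, and let $p_1,\dots,p_k \ge 0$ with $\sum_i p_i = 1$. For shifts $c_1,\dots,c_k$ define the shifted mixture and its differential entropy
$$
g_c(x) = \sum_{i=1}^{k} p_i\, f_i(x - c_i), \qquad
h(g_c) = -\int_{\mathbb{R}} g_c(x) \log g_c(x)\, dx .
$$
The result states that $h(g_c)$ is minimized over the shifts when the shifted modes $m_i + c_i$ all coincide at a common point; in the coding setting, this is exactly the estimator that aligns the modes of the conditional (residual) distributions. The same conclusion is stated to hold for unimodal, rotation-invariant densities on $\mathbb{R}^n$.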
Ting-Li Chen, Stuart Geman
Added: 15 Dec 2010
Updated: 15 Dec 2010
Type: Journal
Year: 2008
Where: TIT (IEEE Transactions on Information Theory)
Authors: Ting-Li Chen, Stuart Geman