On Characterization of Entropy Function via Information Inequalities

Given n discrete random variables {X_1, …, X_n}, associated with any subset α of {1, 2, …, n} there is a joint entropy H(X_α), where X_α = {X_i : i ∈ α}. This can be viewed as a function defined on the power set 2^{1, 2, …, n} taking values in [0, +∞). We call this function the entropy function of {X_1, …, X_n}. The nonnegativity of the joint entropies implies that this function is nonnegative; the nonnegativity of the conditional joint entropies implies that this function is nondecreasing; and the nonnegativity of the conditional mutual informations implies that this function has the following property: for any two subsets α and β of {1, 2, …, n},

H(α) + H(β) ≥ H(α ∪ β) + H(α ∩ β).

These properties are the so-called basic information inequalities of Shannon's information measures. Do these properties fully characterize the entropy function? To make this question more precise, we view an entropy function as a (2^n − 1)-dimensional vector whose coordinates are indexed by the nonempty subsets of the ground ...
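As a small illustration (not from the paper), the sketch below computes the entropy function of a concrete joint distribution and checks the three basic inequalities the abstract lists: nonnegativity, monotonicity, and submodularity. The function names, the tolerance, and the XOR example distribution are illustrative assumptions, not anything specified by the authors.

```python
# A minimal sketch, assuming the joint pmf is given as a dict from n-tuples
# of outcomes to probabilities summing to 1. Computes H(X_a) for every
# subset a of {0, ..., n-1} and verifies the basic Shannon inequalities.
from itertools import chain, combinations
from math import log2

def subsets(n):
    """All subsets of {0, ..., n-1} as frozensets, including the empty set."""
    idx = range(n)
    return [frozenset(s) for s in chain.from_iterable(
        combinations(idx, r) for r in range(n + 1))]

def entropy_function(joint, n):
    """Map each subset a to the joint entropy H(X_a) in bits; H(empty) = 0."""
    H = {}
    for a in subsets(n):
        # Marginalize the joint pmf onto the coordinates in a.
        marginal = {}
        for outcome, p in joint.items():
            key = tuple(outcome[i] for i in sorted(a))
            marginal[key] = marginal.get(key, 0.0) + p
        H[a] = -sum(p * log2(p) for p in marginal.values() if p > 0)
    return H

def check_basic_inequalities(H, n, tol=1e-12):
    """Nonnegativity, monotonicity, and submodularity of the entropy function."""
    for a in subsets(n):
        assert H[a] >= -tol                        # H(a) >= 0
        for b in subsets(n):
            if a <= b:
                assert H[a] <= H[b] + tol          # nondecreasing under inclusion
            # H(a) + H(b) >= H(a union b) + H(a intersect b)
            assert H[a] + H[b] >= H[a | b] + H[a & b] - tol

# Example (hypothetical): X2 = X0 XOR X1, with X0, X1 fair and independent.
joint = {(x0, x1, x0 ^ x1): 0.25 for x0 in (0, 1) for x1 in (0, 1)}
H = entropy_function(joint, 3)
check_basic_inequalities(H, 3)
print({tuple(sorted(a)): round(h, 3) for a, h in H.items()})
```

Every entropy function derived from a joint distribution passes these checks; the paper's question runs in the other direction, asking whether every vector that passes them arises from some joint distribution.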
Zhen Zhang, Raymond W. Yeung
Added 23 Dec 2010
Updated 23 Dec 2010
Type Journal
Year 1998
Where TIT
Authors Zhen Zhang, Raymond W. Yeung