We propose a definition for the entropy of capacities defined on lattices. Classical capacities are monotone set functions and can be seen as a generalization of probability mea...
We consider the problem of computing information-theoretic functions such as entropy on a data stream, using sublinear space. Our first result deals with a measure we call the "...
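The abstract's own measure is truncated above, but a standard way to estimate the empirical entropy of a stream in small space is an AMS-style sampling estimator. The Python sketch below (all names are our own; this is not necessarily this paper's algorithm) illustrates the idea: sample a uniform position, count occurrences of that token from there onward, and form an unbiased single-sample estimate.

```python
import math
import random

def entropy_estimate(stream, trials=2000):
    """Average of AMS-style unbiased estimates of the empirical entropy
    H = sum_i (m_i/m) log2(m/m_i).  Each trial needs only the sampled
    token and one counter, so working space per trial is O(1)."""
    items = list(stream)          # materialized only for this demo
    m = len(items)

    def f(r):                     # f(r) = (r/m) log2(m/r), with f(0) = 0
        return 0.0 if r == 0 else (r / m) * math.log2(m / r)

    total = 0.0
    for _ in range(trials):
        j = random.randrange(m)               # uniform random stream position
        r = items[j:].count(items[j])         # occurrences from j onward
        total += m * (f(r) - f(r - 1))        # telescopes so E[estimate] = H
    return total / trials

# Four equally frequent symbols: empirical entropy is 2 bits.
print(entropy_estimate("aabbccdd" * 50))
```

In a genuine one-pass setting the forward count would be maintained online (reservoir-style) rather than via the slice shown here; the slice only simulates that counter for readability.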
When using the ordered weighted average (OWA) operator, it can happen that one wants to optimize the variability (measured either by the entropy, to be maximized, or by the variance, to be minimized) of the ...
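For context, a common concrete instance of this problem is O'Hagan's maximum-entropy OWA weights: choose the weight vector of maximal entropy subject to a prescribed orness level α. The constraint below is the standard orness definition; the abstract's precise formulation may differ.

```latex
\max_{w}\; -\sum_{i=1}^{n} w_i \ln w_i
\quad\text{s.t.}\quad
\sum_{i=1}^{n} w_i = 1,\qquad
\frac{1}{n-1}\sum_{i=1}^{n}(n-i)\,w_i = \alpha,\qquad
w_i \ge 0 .
```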
While most useful information-theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon’s entropy power inequality (EPI) seems...
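For reference, the EPI states that for independent random vectors X and Y in R^n with densities,

```latex
N(X+Y) \;\ge\; N(X) + N(Y),
\qquad
N(X) \;=\; \frac{1}{2\pi e}\, e^{2h(X)/n},
```

where h denotes differential entropy; equality holds when X and Y are Gaussian with proportional covariance matrices.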
Let X be a discrete random variable with support S and let f : S → S be a bijection. It is then well-known that the entropy of X is the same as the entropy of f(X). This entropy preserva...
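The preservation claim is a one-line computation: since f is a bijection on S, the distribution of f(X) is just a relabeling of that of X, with Pr[f(X) = f(x)] = Pr[X = x], so

```latex
H(f(X)) \;=\; -\sum_{y \in f(S)} \Pr[f(X)=y]\,\log \Pr[f(X)=y]
         \;=\; -\sum_{x \in S} \Pr[X=x]\,\log \Pr[X=x] \;=\; H(X).
```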
In this paper, both non-mixing and mixing local minima of the entropy are analyzed from the viewpoint of blind source separation (BSS); they correspond respectively to acceptable a...
We investigate the complexity of the following computational problem: Polynomial Entropy Approximation (PEA): Given a low-degree polynomial mapping p : F^n → F^m, where F is a finite...
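To make the quantity concrete: PEA asks to approximate the Shannon entropy of p(U_n), where U_n is uniform on F^n. The brute-force computation below (Python; the example map is our own hypothetical degree-2 map over F_2) shows exactly what is being approximated, at cost exponential in n:

```python
import math
from itertools import product

def entropy_of_map(p, n, q=2):
    """Exact Shannon entropy (bits) of p(U_n) for U_n uniform on F_q^n.
    Enumerates all q**n inputs, so this is only an illustration of the
    quantity that PEA asks to approximate efficiently."""
    counts = {}
    for x in product(range(q), repeat=n):
        y = tuple(p(x))
        counts[y] = counts.get(y, 0) + 1
    total = q ** n
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# Hypothetical degree-2 map over F_2: x -> (x1*x2, x1 + x3 mod 2).
p = lambda x: ((x[0] * x[1]) % 2, (x[0] + x[2]) % 2)
print(entropy_of_map(p, n=3))
```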
Zeev Dvir, Dan Gutfreund, Guy N. Rothblum, Salil P...
Information estimates such as the "direct method" of Strong et al. (1998) sidestep the difficult problem of estimating the joint distribution of response and stimulus by...
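As a rough illustration of the flavor of such estimates (not the direct method itself, which adds bias-correction and extrapolation steps this sketch omits), a plug-in entropy estimate over binarized response words looks like the following; the spike-word data are hypothetical:

```python
import math
from collections import Counter

def plugin_entropy_bits(words):
    """Maximum-likelihood ('plug-in') entropy, in bits, of a list of
    discrete response words: H = -sum_w (n_w/n) log2(n_w/n)."""
    counts = Counter(words)
    n = len(words)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Hypothetical binarized spike words (one per trial, fixed time bins).
trials = [(0, 1, 1), (0, 1, 0), (0, 1, 1), (1, 0, 1), (0, 1, 1)]
print(plugin_entropy_bits(trials))
```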
Algorithmic entropy can be seen as a special case of entropy as studied in statistical mechanics. This viewpoint allows us to apply many techniques developed for use in thermodyna...
A number of measures have been proposed to quantify the direction of coupling between two time series, and transfer entropy (TE) has been found in recent studies to perform consistently...
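For reference, Schreiber's transfer entropy from Y to X (shown here with history length one; longer histories are handled analogously) is

```latex
T_{Y \to X} \;=\; \sum_{x_{t+1},\, x_t,\, y_t} p(x_{t+1}, x_t, y_t)\,
\log \frac{p(x_{t+1} \mid x_t, y_t)}{p(x_{t+1} \mid x_t)} ,
```

which is nonzero exactly when the past of Y improves prediction of X beyond what X's own past provides.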