In this paper we use the cumulative distribution of a random variable to define its information content, and we use it to develop a novel measure of information that parallels Shannon entropy, which we dub cumulative residual entropy (CRE). The salient features of CRE are: (1) it is more general than Shannon entropy, in that its definition is valid in both the continuous and discrete domains; (2) it possesses more general mathematical properties than Shannon entropy; and (3) it can be easily computed from sample data, and these computations asymptotically converge to the true values. Based on CRE, we define the cross-CRE (CCRE) between two random variables and apply it to solve the uni- and multi-modal image alignment problem for parameterized (rigid, affine and projective) transformations. The key strengths of CCRE over the now-popular mutual information method (based on Shannon entropy) are that the former has significantly larger noise immunity and a much larger...
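Feature (3), easy estimation of CRE from sample data, can be sketched as follows. This is an illustrative sketch, not the authors' implementation: it assumes the CRE definition CRE(X) = -∫₀^∞ P(|X|>λ) log P(|X|>λ) dλ, and the function name `empirical_cre`, the grid size, and the trapezoidal quadrature are all arbitrary choices.

```python
import numpy as np

def empirical_cre(samples, grid_size=2000):
    """Empirical cumulative residual entropy of a sample, assuming
    CRE(X) = -int_0^inf P(|X| > l) * log P(|X| > l) dl."""
    x = np.sort(np.abs(np.asarray(samples, dtype=float)))
    n = x.size
    lam = np.linspace(0.0, x[-1], grid_size)
    # empirical survival function P(|X| > lambda)
    surv = 1.0 - np.searchsorted(x, lam, side="right") / n
    # integrand -s * log(s), with the convention 0 * log 0 = 0
    integrand = np.where(surv > 0.0,
                         -surv * np.log(np.clip(surv, 1e-300, None)),
                         0.0)
    # trapezoidal quadrature over the lambda grid
    return float(np.sum((integrand[1:] + integrand[:-1]) * np.diff(lam)) / 2.0)
```

As a sanity check, for X ~ Exp(1) the survival function is e^{-λ}, so the integral evaluates to ∫₀^∞ λe^{-λ} dλ = 1 exactly, and for X ~ U(0,1) it evaluates to 1/4; the empirical estimate approaches these values as the sample size grows, consistent with the convergence claim in (3).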
Fei Wang, Baba C. Vemuri, Murali Rao, Yunmei Chen