
WBIR 2010 (Springer)

Normalized Measures of Mutual Information with General Definitions of Entropy for Multimodal Image Registration

Abstract. Mutual information (MI) was introduced for use in multimodal image registration over a decade ago [1,2,3,4]. The MI between two images is based on their marginal and joint/conditional entropies. The most common versions of entropy used to compute MI are the Shannon and differential entropies; however, many other definitions of entropy have been proposed as competitors. In this article, we show how to construct normalized versions of MI using any of these definitions of entropy. The resulting similarity measures are analogous to normalized mutual information (NMI), entropy correlation coefficient (ECC), and symmetric uncertainty (SU), which have all been shown to be superior to MI in a variety of situations. We use publicly available CT, PET, and MR brain images with known ground truth transformations to evaluate the performance of the normalized measures for rigid multimodal registration. Results show that for a number of different definitions of entropy, the proposed normal...
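The abstract builds normalized similarity measures (NMI, ECC, SU) from marginal and joint entropies. As a rough illustration only, the sketch below estimates these quantities from a joint intensity histogram using the Shannon entropy as one instance of the general entropy definitions; it is not the paper's implementation, and the function names and bin count are assumptions.

```python
# Minimal sketch, assuming Shannon entropy and a fixed-bin joint histogram.
import numpy as np

def shannon_entropy(p):
    """Shannon entropy of a discrete probability distribution."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def normalized_mi_measures(img_a, img_b, bins=64):
    """MI, NMI, ECC, and SU estimated from a joint intensity histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_ab = joint / joint.sum()
    p_a = p_ab.sum(axis=1)           # marginal distribution of image A
    p_b = p_ab.sum(axis=0)           # marginal distribution of image B

    h_a = shannon_entropy(p_a)
    h_b = shannon_entropy(p_b)
    h_ab = shannon_entropy(p_ab.ravel())

    mi = h_a + h_b - h_ab            # mutual information
    nmi = (h_a + h_b) / h_ab         # normalized mutual information
    ecc = 2.0 * mi / (h_a + h_b)     # entropy correlation coefficient
    su = ecc                         # symmetric uncertainty coincides with ECC
                                     # when the Shannon entropy is used
    return mi, nmi, ecc, su
```

In a rigid registration setting, one of these normalized values would be maximized over the transformation parameters; the paper's contribution is that the same constructions carry over to other entropy definitions.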
Nathan D. Cahill
Type: Conference paper
Year: 2010
Where: WBIR
Authors: Nathan D. Cahill