In this paper we exploit normalized mutual information
for the nonrigid registration of multimodal images.
Rather than assuming that image statistics are spatially
stationary, as often done in traditional information-theoretic
methods, we take into account the spatial variability
through a weighted combination of global normalized
mutual information and local matching statistics.
Spatial relationships are incorporated into the registration
criterion by adaptively adjusting the weight according
to the strength of local cues. With a continuous
representation of images and Parzen window estimators,
we have developed closed-form expressions of the first-order
variation with respect to any general, nonparametric,
infinite-dimensional deformation of the image
domain. To characterize the performance of the proposed
approach, synthetic phantoms, simulated MRIs,
and clinical data are used in a validation study. The
results suggest that the augmented normalized mutual
information...
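The core quantities above can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it estimates the joint intensity density of two images with a Gaussian Parzen window, computes Studholme's normalized mutual information (H(A) + H(B)) / H(A, B) from it, and combines the global NMI with a local matching statistic through an adaptive weight. The bin count, kernel width, and the particular weighting rule `w = s / (1 + s)` are illustrative assumptions.

```python
import numpy as np

def parzen_joint_density(a, b, bins=32, sigma=1.0):
    """Estimate the joint intensity density of two images with a
    Gaussian Parzen window on a bins x bins intensity grid.
    (Bin count and kernel width are illustrative choices.)"""
    # Rescale intensities onto the grid range [0, bins - 1].
    a = (a - a.min()) / (a.max() - a.min() + 1e-12) * (bins - 1)
    b = (b - b.min()) / (b.max() - b.min() + 1e-12) * (bins - 1)
    centers = np.arange(bins)
    # Gaussian kernel response of every sample at every bin center.
    wa = np.exp(-0.5 * ((a.ravel()[:, None] - centers) / sigma) ** 2)
    wb = np.exp(-0.5 * ((b.ravel()[:, None] - centers) / sigma) ** 2)
    wa /= wa.sum(axis=1, keepdims=True)
    wb /= wb.sum(axis=1, keepdims=True)
    p = wa.T @ wb                     # accumulate the smoothed joint histogram
    return p / p.sum()

def normalized_mutual_info(p):
    """Studholme's NMI: (H(A) + H(B)) / H(A, B)."""
    pa, pb = p.sum(axis=1), p.sum(axis=0)
    h = lambda q: -np.sum(q[q > 0] * np.log(q[q > 0]))
    return (h(pa) + h(pb)) / h(p)

def augmented_criterion(nmi_global, local_stat, cue_strength):
    """Hypothetical weighted combination: stronger local cues shift
    weight toward the local matching statistic. The weighting rule
    is an assumed placeholder, not the paper's exact formulation."""
    w = cue_strength / (1.0 + cue_strength)
    return (1.0 - w) * nmi_global + w * local_stat
```

With this estimator, identical images yield an NMI above that of unrelated images (which approaches 1, since the joint density factorizes), which is the behavior the registration criterion relies on.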