We describe the g-factor, which relates probability distributions on image features to distributions on the images themselves. The g-factor depends only on our choice of features and lattice quantization, and is independent of the training image data. We illustrate the importance of the g-factor by analyzing Minimax Entropy Learning (MEL) [8], which learns image distributions in terms of clique potentials corresponding to feature statistics. First, we use our analysis of the g-factor to determine when the MEL clique potentials decouple for different features. Second, we show that the MEL clique potentials can be computed analytically by approximating the g-factor. We support our analysis with computer simulations.
James M. Coughlan, Alan L. Yuille
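As a point of reference, the relation described in the abstract can be stated compactly. The following is a minimal sketch in standard notation, assuming a finite set of quantized images $\mathbf{x}$ and a feature statistic $\vec{\phi}(\mathbf{x})$; the symbols $g$, $\hat{P}$, and $P$ are illustrative and need not match the paper's exact definitions:

% Sketch (assumed notation): the g-factor counts images sharing a given
% feature statistic, so a distribution \hat{P} on statistics induces a
% distribution P on images that is uniform within each statistic class.
\[
  g(\vec{\phi}) \;=\; \sum_{\mathbf{x}} \delta_{\vec{\phi}(\mathbf{x}),\,\vec{\phi}}
  \qquad\qquad
  P(\mathbf{x}) \;=\; \frac{\hat{P}\bigl(\vec{\phi}(\mathbf{x})\bigr)}{g\bigl(\vec{\phi}(\mathbf{x})\bigr)}.
\]

Because $g$ depends only on the choice of features and the lattice quantization, it can in principle be computed or approximated without reference to training data, which is the property the abstract exploits in analyzing MEL.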