Variational methods for model comparison have become popular in the neural-computing and machine-learning literature. In this paper we explore their application to the Bayesian analysis of mixtures of Gaussians. We also consider how the Deviance Information Criterion, or DIC, devised by Spiegelhalter et al. (2002), can be extended to these types of model by exploiting variational approximations. We illustrate the use of variational methods for model selection and the calculation of the DIC on real and simulated data. With the variational approximation, one can estimate component parameters and model complexity simultaneously: if one starts with a large number of components, superfluous components are eliminated as the method converges to a solution, yielding an automatic choice of model complexity whose appropriateness is reflected in the DIC values.
Clare A. McGrory, D. M. Titterington
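As a rough illustration of the pruning behaviour described in the abstract, the following sketch uses scikit-learn's BayesianGaussianMixture, a standard variational Bayes treatment of Gaussian mixtures, rather than the authors' own implementation; the simulated two-component data, the choice of ten initial components, and the 0.01 weight threshold are illustrative assumptions, not values from the paper.

```python
# A minimal sketch (assuming scikit-learn's variational GMM as a stand-in
# for the paper's method): fit an over-specified mixture and observe that
# the weights of superfluous components are driven towards zero.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Simulate data from a two-component univariate Gaussian mixture.
y = np.concatenate([rng.normal(-2.0, 1.0, 300),
                    rng.normal(3.0, 0.5, 200)]).reshape(-1, 1)

# Deliberately start with far more components than needed; the
# variational optimisation shrinks the weights of redundant ones.
vb = BayesianGaussianMixture(
    n_components=10,                                        # over-specified
    weight_concentration_prior_type="dirichlet_distribution",
    max_iter=500,
    random_state=0,
).fit(y)

# Components whose fitted weights fall below a small threshold
# (0.01 here, an arbitrary illustrative cutoff) are effectively pruned.
effective = int(np.sum(vb.weights_ > 0.01))
print("fitted weights:", np.round(vb.weights_, 3))
print("effective number of components:", effective)
print("final variational lower bound (ELBO):", vb.lower_bound_)
```

The variational lower bound reported at convergence plays the role of the approximate model-fit summary from which model-comparison quantities such as the DIC can be built, though computing the DIC itself requires expectations under the variational posterior that this sketch does not attempt.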