

Variational approximations in Bayesian model selection for finite mixture distributions

Variational methods for model comparison have become popular in the neural computing/machine learning literature. In this paper we explore their application to the Bayesian analysis of mixtures of Gaussians. We also consider how the Deviance Information Criterion, or DIC, devised by Spiegelhalter et al. (2002), can be extended to these types of model by exploiting the use of variational approximations. We illustrate the results of using variational methods for model selection and the calculation of a DIC using real and simulated data. Using the variational approximation, one can simultaneously estimate component parameters and the model complexity. It turns out that, if one starts off with a large number of components, superfluous components are eliminated as the method converges to a solution, thereby leading to an automatic choice of model complexity, the appropriateness of which is reflected in the DIC values.
Clare A. McGrory, D. M. Titterington
Type: Journal
Year: 2007
Where: CSDA (Computational Statistics & Data Analysis)
Authors: Clare A. McGrory, D. M. Titterington
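
The component-elimination behaviour described in the abstract can be illustrated with an off-the-shelf variational fit of a Gaussian mixture. The sketch below is not the authors' code and does not compute their DIC; it uses scikit-learn's BayesianGaussianMixture (a variational approximation for finite mixtures), and the simulated data, prior settings, and weight threshold are purely illustrative assumptions. Fitting with deliberately too many components, the posterior weights of superfluous components shrink towards zero, giving an automatic estimate of model complexity.

import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)

# Simulated data: three well-separated univariate Gaussian components
# (300 observations each); settings are purely illustrative.
X = np.concatenate([
    rng.normal(-4.0, 1.0, 300),
    rng.normal(0.0, 0.5, 300),
    rng.normal(5.0, 1.5, 300),
]).reshape(-1, 1)

# Deliberately over-specify the number of components; the variational
# posterior should drive the weights of superfluous components towards zero.
vb = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_distribution",  # finite-mixture prior
    weight_concentration_prior=1e-3,  # small concentration favours sparse weights
    max_iter=500,
    random_state=0,
)
vb.fit(X)

# Components retaining non-negligible posterior weight indicate the
# effective model complexity selected by the variational fit.
print("posterior weights:", np.round(vb.weights_, 3))
print("effective components:", int(np.sum(vb.weights_ > 1e-2)))
print("variational lower bound (ELBO):", vb.lower_bound_)

With a small concentration parameter on the mixture weights, the fit typically retains close to the three data-generating components while the remaining weights collapse towards zero, mirroring the automatic choice of model complexity described in the abstract.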