Expectation Propagation for approximate Bayesian inference

This paper presents a new deterministic approximation technique in Bayesian networks. This method, "Expectation Propagation," unifies two previous techniques: assumed-density filtering, an extension of the Kalman filter, and loopy belief propagation, an extension of belief propagation in Bayesian networks. Loopy belief propagation, because it propagates exact belief states, is useful for a limited class of belief networks, such as those which are purely discrete. Expectation Propagation approximates the belief states by only retaining expectations, such as mean and variance, and iterates until these expectations are consistent throughout the network. This makes it applicable to hybrid networks with discrete and continuous nodes. Experiments with Gaussian mixture models show Expectation Propagation to be convincingly better than methods with similar computational cost: Laplace's method, variational Bayes, and Monte Carlo. Expectation Propagation also provides an efficient algorithm for training Bayes point machine classifiers.
Thomas P. Minka
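
To make the moment-matching idea in the abstract concrete, here is a minimal sketch of EP on a simple one-dimensional mixture ("clutter") model: each observation comes either from a unit-variance Gaussian centred at the unknown mean or from a broad clutter component, each exact likelihood factor is replaced by a Gaussian site, and sites are refined by matching the mean and variance of the tilted distribution. The constants (prior variance 100, clutter weight 0.5, clutter variance 10), function names, and data below are illustrative assumptions, not the paper's actual experiment.

```python
import numpy as np

def norm_pdf(x, mean, var):
    """Density of a univariate Gaussian with the given mean and variance."""
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def ep_clutter(x, prior_var=100.0, w=0.5, clutter_var=10.0, n_sweeps=20):
    """EP sketch for the 1-D clutter model: infer the mean theta when each
    observation is drawn from N(theta, 1) with probability 1 - w and from a
    broad N(0, clutter_var) clutter component with probability w.
    Each likelihood factor gets a Gaussian site in natural parameters
    (tau = 1/variance, nu = mean/variance); sites start flat (tau = nu = 0)."""
    n = len(x)
    tau_site = np.zeros(n)        # site precisions
    nu_site = np.zeros(n)         # site precision-times-mean
    tau_q = 1.0 / prior_var       # global posterior precision (prior only at start)
    nu_q = 0.0                    # global posterior precision * mean (prior mean 0)

    for _ in range(n_sweeps):
        for i in range(n):
            # 1. Cavity: remove site i from the current Gaussian posterior.
            tau_cav = tau_q - tau_site[i]
            nu_cav = nu_q - nu_site[i]
            if tau_cav <= 0:      # skip updates that would give an improper cavity
                continue
            v_cav, m_cav = 1.0 / tau_cav, nu_cav / tau_cav

            # 2. Moments of the tilted distribution (cavity times exact factor),
            #    which here is a mixture of two Gaussians in theta.
            z_signal = (1.0 - w) * norm_pdf(x[i], m_cav, v_cav + 1.0)
            z_clutter = w * norm_pdf(x[i], 0.0, clutter_var)
            r = z_signal / (z_signal + z_clutter)      # responsibility of the signal part
            m_sig = m_cav + v_cav * (x[i] - m_cav) / (v_cav + 1.0)
            v_sig = v_cav / (v_cav + 1.0)
            m_hat = r * m_sig + (1.0 - r) * m_cav
            v_hat = (r * (v_sig + m_sig ** 2)
                     + (1.0 - r) * (v_cav + m_cav ** 2)) - m_hat ** 2

            # 3. Moment matching: set the new posterior to these moments and
            #    read off the implied parameters of site i.
            tau_q, nu_q = 1.0 / v_hat, m_hat / v_hat
            tau_site[i] = tau_q - tau_cav
            nu_site[i] = nu_q - nu_cav

    return nu_q / tau_q, 1.0 / tau_q   # posterior mean and variance of theta

# Illustrative data: mostly points near theta = 2, plus a few clutter points.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(2.0, 1.0, 40),
                       rng.normal(0.0, np.sqrt(10.0), 10)])
print(ep_clutter(data))
```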
Type: Conference
Year: 2001
Where: UAI
Authors: Thomas P. Minka