Confidence measures for the generalization error are crucial when small training samples are used to construct classifiers. A common approach is to estimate the generalization err...
The paper introduces an AND/OR importance sampling scheme for probabilistic graphical models. In contrast to conventional importance sampling, AND/OR importance sampling caches sa...
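For context, the following is a minimal sketch of conventional importance sampling on a toy discrete model, the baseline the paper improves upon; it does not implement the AND/OR caching scheme itself. The joint table and function names are illustrative assumptions, not taken from the paper.

```python
import random

# Toy joint distribution P(a, b) over two binary variables.
JOINT = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def proposal(rng):
    """Uniform proposal Q over the four joint states, Q = 0.25 each."""
    return (rng.randint(0, 1), rng.randint(0, 1))

def estimate_marginal_a1(num_samples=100_000, seed=0):
    """Importance-sampling estimate of P(A = 1); exact value is 0.5.

    Each sample drawn from Q is weighted by w = P(a, b) / Q(a, b),
    and the marginal is the weighted fraction of samples with a = 1.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(num_samples):
        a, b = proposal(rng)
        weight = JOINT[(a, b)] / 0.25
        if a == 1:
            total += weight
    return total / num_samples
```

In the conventional scheme above every sample is drawn and weighted independently; the AND/OR variant exploits the model's decomposition so that sampled sub-results can be reused across samples.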
Many algorithms and applications involve repeatedly solving variations of the same inference problem; for example, we may want to introduce new evidence into the model or perform upd...
Umut A. Acar, Alexander T. Ihler, Ramgopal R. Mett...
Graphical models are usually learned without regard to the cost of doing inference with them. As a result, even if a good model is learned, it may perform poorly at prediction, be...
In this paper, we develop the continuous time dynamic topic model (cDTM). The cDTM is a dynamic topic model that uses Brownian motion to model the latent topics through a sequenti...
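As a minimal illustration of the modeling primitive the cDTM builds on, the sketch below samples a Brownian-motion trajectory at arbitrary, unevenly spaced time stamps; this is an assumption-laden toy, not the cDTM inference procedure, and the function name `brownian_path` is invented for this example.

```python
import math
import random

def brownian_path(times, variance=0.1, seed=0):
    """Sample a Brownian-motion trajectory at given time stamps.

    The increment between consecutive times s < t is Gaussian with
    mean 0 and variance `variance * (t - s)`, so irregularly spaced
    observations are handled naturally -- no fixed time discretization
    is required.
    """
    rng = random.Random(seed)
    value, prev = 0.0, times[0]
    path = [value]
    for t in times[1:]:
        value += rng.gauss(0.0, math.sqrt(variance * (t - prev)))
        path.append(value)
        prev = t
    return path
```

In a continuous-time topic model, a trajectory like this would drive the drift of a latent topic parameter between observed document time stamps.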
Bounded policy iteration is an approach to solving infinite-horizon POMDPs that represents policies as stochastic finite-state controllers and iteratively improves a controller by a...
Variational Bayesian inference and (collapsed) Gibbs sampling are two important classes of inference algorithms for Bayesian networks. Both have their advantages and disadvant...
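To make the sampling side of this comparison concrete, here is a minimal Gibbs sampler for a toy joint over two binary variables; it illustrates plain Gibbs sampling only, not the hybrid or collapsed algorithms the abstract discusses, and the joint table and function names are invented for this sketch.

```python
import random

# Toy joint distribution P(a, b) over two binary variables.
JOINT = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def cond_a1(b):
    """P(A = 1 | B = b) computed from the joint table."""
    return JOINT[(1, b)] / (JOINT[(0, b)] + JOINT[(1, b)])

def cond_b1(a):
    """P(B = 1 | A = a) computed from the joint table."""
    return JOINT[(a, 1)] / (JOINT[(a, 0)] + JOINT[(a, 1)])

def gibbs_marginal_a1(iters=200_000, burn_in=1_000, seed=0):
    """Gibbs-sampling estimate of P(A = 1); exact value is 0.5.

    Alternately resample each variable from its full conditional,
    discard a burn-in prefix, and average the indicator of A = 1.
    """
    rng = random.Random(seed)
    a, b, hits = 0, 0, 0
    for i in range(iters):
        a = 1 if rng.random() < cond_a1(b) else 0
        b = 1 if rng.random() < cond_b1(a) else 0
        if i >= burn_in:
            hits += a
    return hits / (iters - burn_in)
```

A variational approach would instead optimize a factorized approximation to the same posterior, trading the sampler's asymptotic exactness for a deterministic, typically faster, computation.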
While known algorithms for sensitivity analysis and parameter tuning in probabilistic networks have a running time that is exponential in the size of the network, the exact comput...