This paper proposes a new randomized strategy for adaptive MCMC using Bayesian optimization. This approach applies to nondifferentiable objective functions and trades off explor...
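To make the general recipe concrete, here is a minimal, hypothetical sketch (not the paper's algorithm): a random-walk Metropolis sampler whose step size is tuned by a small Gaussian-process Bayesian optimization loop with an upper-confidence-bound rule. The tuning objective (mean squared jump distance over a pilot run), the target density, and all names are illustrative assumptions; the objective is non-differentiable in the step size, which is the situation the abstract refers to.

# Hypothetical sketch: Bayesian-optimization tuning of a random-walk Metropolis step size.
import numpy as np

def log_target(x):
    # Illustrative target: standard 2-D Gaussian.
    return -0.5 * np.sum(x ** 2)

def pilot_run(step, n=500, rng=None):
    # Short Metropolis chain; returns the mean squared jump distance as a tuning score.
    rng = np.random.default_rng(0) if rng is None else rng
    x, lp, jumps = np.zeros(2), log_target(np.zeros(2)), []
    for _ in range(n):
        prop = x + step * rng.standard_normal(2)
        lp_prop = log_target(prop)
        if np.log(rng.random()) < lp_prop - lp:
            jumps.append(np.sum((prop - x) ** 2))
            x, lp = prop, lp_prop
        else:
            jumps.append(0.0)
    return float(np.mean(jumps))

def rbf(a, b, ell=0.5):
    # Squared-exponential kernel on log10(step size).
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def gp_posterior(x_obs, y_obs, x_grid, noise=1e-3):
    # Standard Gaussian-process regression posterior mean and variance.
    K = rbf(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks, Kss = rbf(x_grid, x_obs), rbf(x_grid, x_grid)
    mean = Ks @ np.linalg.solve(K, y_obs)
    var = np.diag(Kss - Ks @ np.linalg.solve(K, Ks.T))
    return mean, np.maximum(var, 1e-12)

rng = np.random.default_rng(1)
grid = np.linspace(-2.0, 1.0, 50)                  # candidate log10 step sizes
xs = [-1.0, 0.0]                                   # two initial evaluations
ys = [pilot_run(10 ** s, rng=rng) for s in xs]
for _ in range(8):
    mean, var = gp_posterior(np.array(xs), np.array(ys), grid)
    ucb = mean + 2.0 * np.sqrt(var)                # explicit exploration/exploitation trade-off
    xs.append(float(grid[int(np.argmax(ucb))]))
    ys.append(pilot_run(10 ** xs[-1], rng=rng))
print("selected step size:", 10 ** xs[int(np.argmax(ys))])

The upper-confidence-bound acquisition makes the exploration/exploitation trade-off explicit: high posterior variance favors untried step sizes, high posterior mean favors re-exploiting good ones.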
We address the problem of learning the parameters in graphical models when inference is intractable. A common strategy in this case is to replace the partition function with its B...
We show that the log-likelihood of several probabilistic graphical models is Lipschitz continuous with respect to the p-norm of the parameters. We discuss several implications ...
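For reference, the property claimed here has the standard Lipschitz form below; the constant $L$ and the norm index $p$ are left generic, since the paper's specific bounds are not quoted in the truncated abstract:

\[ |\ell(\theta) - \ell(\theta')| \;\le\; L\,\|\theta - \theta'\|_p \qquad \text{for all parameter vectors } \theta, \theta'. \]

In words, the log-likelihood can change by at most $L$ times the size of any perturbation of the parameters.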
A common approach in machine learning is to use a large amount of labeled data to train a model. Usually this model can then only be used to classify data in the same feature spac...
Background: Discovering the genetic basis of common genetic diseases in the human genome represents a public health issue. However, the dimensionality of the genetic data (up to 1...
Raphael Mourad, Christine Sinoquet, Philippe Leray
With the increasing popularity of large-scale probabilistic graphical models, even "lightweight" approximate inference methods are becoming infeasible. Fortunately, often...
Quantum systems are promising candidates for future computing and information processing devices. In a large system, information about the quantum states and processes may be incomp...
We propose a simple approach to combining first-order logic and probabilistic graphical models in a single representation. A Markov logic network (MLN) is a first-order knowledge b...
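For reference, the joint distribution an MLN defines over possible worlds $x$ takes the standard log-linear form, with $w_i$ the weight attached to formula $F_i$ and $n_i(x)$ its number of true groundings; this is the usual MLN definition restated here, not a quotation from the truncated abstract:

\[ P(X = x) \;=\; \frac{1}{Z} \exp\Big( \sum_i w_i\, n_i(x) \Big), \qquad Z \;=\; \sum_{x'} \exp\Big( \sum_i w_i\, n_i(x') \Big). \]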
The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate stat...
We consider the question of how well a given distribution can be approximated with probabilistic graphical models. We introduce a new parameter, effective treewidth, that captures...