We propose an approach to lossy source coding, utilizing ideas from Gibbs sampling, simulated annealing, and Markov Chain Monte Carlo (MCMC). The idea is to sample a reconstructio...
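Purely as an illustrative aside, and not the algorithm from the entry above: the sketch below shows generic simulated annealing with single-symbol Metropolis moves over a candidate reconstruction sequence; the binary alphabet, Hamming cost, and geometric cooling schedule are placeholder assumptions.

```python
import math
import random

def anneal(source, alphabet, cost, n_sweeps=200, t0=1.0, cooling=0.995):
    """Generic simulated annealing over a candidate reconstruction sequence.

    `cost(candidate, source)` is any energy to minimize (e.g. distortion plus a
    rate surrogate); the Hamming cost used below is an assumed placeholder.
    """
    x = [random.choice(alphabet) for _ in source]   # random initial reconstruction
    e = cost(x, source)
    temp = t0
    for _ in range(n_sweeps):
        for i in range(len(x)):                     # one sweep: revisit each position
            old = x[i]
            x[i] = random.choice(alphabet)          # propose a single-symbol change
            e_new = cost(x, source)
            # Metropolis acceptance at the current temperature
            if e_new <= e or random.random() < math.exp((e - e_new) / temp):
                e = e_new
            else:
                x[i] = old                          # reject: restore the old symbol
        temp *= cooling                             # cool down after every sweep
    return x, e

# Toy usage: binary source, Hamming distortion only.
src = [random.randint(0, 1) for _ in range(64)]
hamming = lambda x, s: sum(a != b for a, b in zip(x, s))
rec, final_cost = anneal(src, alphabet=[0, 1], cost=hamming)
print(final_cost)
```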
We consider mixtures of parametric densities on the positive reals with a normalized generalized gamma process (Brix, 1999) as mixing measure. This class of mixtures encompasses t...
Raffaele Argiento, Alessandra Guglielmi, Antonio P...
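Loose illustration for the preceding entry: the normalized generalized gamma family contains the Dirichlet process as a special case, so the sketch below draws one random mixture density on the positive reals from a truncated stick-breaking (Dirichlet process) construction with gamma kernels. The base measure, truncation level, and concentration parameter are assumptions, and this is not the authors' model or sampler.

```python
import math
import numpy as np

def sample_dp_gamma_mixture(alpha=2.0, n_atoms=50, seed=0):
    """Draw one random density on the positive reals from a Dirichlet-process
    mixture of gamma kernels via a truncated stick-breaking construction."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(0.01, 10.0, 400)
    # Stick-breaking weights: w_k = beta_k * prod_{j<k} (1 - beta_j).
    betas = rng.beta(1.0, alpha, size=n_atoms)
    weights = betas * np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
    weights /= weights.sum()                        # renormalize after truncation
    # Atoms from an (assumed) base measure: shape/rate pairs for gamma kernels.
    shapes = rng.gamma(2.0, 1.0, size=n_atoms) + 0.5
    rates = rng.gamma(2.0, 1.0, size=n_atoms) + 0.1
    dens = np.zeros_like(grid)
    for w, s, r in zip(weights, shapes, rates):
        # Gamma(shape=s, rate=r) density evaluated on the grid.
        dens += w * (r ** s) * grid ** (s - 1) * np.exp(-r * grid) / math.gamma(s)
    return grid, dens

grid, dens = sample_dp_gamma_mixture()
print(dens[:5].round(4))                            # peek at the first few values
```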
Recently developed adaptive Markov chain Monte Carlo (MCMC) methods have been applied successfully to many problems in Bayesian statistics. Grapham is a new open source implementat...
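Grapham itself is not reproduced here; as a rough, hedged illustration of the kind of adaptive MCMC the entry refers to, the sketch below runs random-walk Metropolis with a proposal scale that adapts toward a fixed target acceptance rate under a diminishing-adaptation schedule. The standard-normal target and the tuning constants are assumptions.

```python
import math
import random

def adaptive_rwm(log_target, x0, n_iter=5000, target_accept=0.44):
    """Random-walk Metropolis with an adapting proposal scale.

    The log proposal standard deviation is nudged after every step so the
    acceptance rate drifts toward `target_accept` (Robbins-Monro style).
    """
    x, lp = x0, log_target(x0)
    log_scale = 0.0
    samples = []
    for i in range(1, n_iter + 1):
        prop = x + math.exp(log_scale) * random.gauss(0.0, 1.0)
        lp_prop = log_target(prop)
        accept_prob = math.exp(min(0.0, lp_prop - lp))
        if random.random() < accept_prob:
            x, lp = prop, lp_prop
        # Diminishing adaptation: the step size shrinks as i grows.
        log_scale += (accept_prob - target_accept) / i ** 0.6
        samples.append(x)
    return samples

# Toy usage: sample a standard normal target.
draws = adaptive_rwm(lambda x: -0.5 * x * x, x0=0.0)
print(sum(draws) / len(draws))
```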
As an example of the recently introduced concept of rate of innovation, signals that are linear combinations of a finite number of Diracs per unit time can be acquired by linear fi...
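As a hedged companion to the preceding entry (not necessarily the acquisition scheme it truncates into), the sketch below recovers the locations and amplitudes of a periodic stream of Diracs from 2K+1 Fourier-series coefficients with the standard annihilating-filter method; the toy signal parameters are assumptions and the coefficients are noiseless.

```python
import numpy as np

def recover_diracs(fourier_coeffs, K, tau):
    """Recover K Dirac locations/amplitudes from Fourier coefficients X[0..2K]
    of a tau-periodic Dirac stream, via the annihilating-filter method."""
    X = np.asarray(fourier_coeffs, dtype=complex)
    # Solve the Toeplitz system  sum_{l=0..K} h[l] X[m-l] = 0  with h[0] = 1.
    A = np.array([[X[K + r - l] for l in range(1, K + 1)] for r in range(K)])
    b = -X[K:2 * K]
    h = np.concatenate(([1.0], np.linalg.solve(A, b)))
    # Roots of the annihilating filter encode the Dirac locations.
    u = np.roots(h)
    t = np.mod(-np.angle(u) * tau / (2 * np.pi), tau)
    # Amplitudes from a small Vandermonde system on the first K coefficients.
    V = np.vander(u, K, increasing=True).T          # V[m, k] = u_k**m
    a = tau * np.linalg.solve(V, X[:K])
    return np.sort(t), a[np.argsort(t)]

# Toy usage: two Diracs per period, noiseless coefficients.
tau, t_true, a_true = 1.0, np.array([0.21, 0.64]), np.array([1.5, -0.7])
m = np.arange(2 * len(t_true) + 1)
X = (a_true[None, :] * np.exp(-2j * np.pi * np.outer(m, t_true) / tau)).sum(axis=1) / tau
t_hat, a_hat = recover_diracs(X, K=2, tau=tau)
print(t_hat, np.real_if_close(a_hat))
```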
Background: With the explosion in data generated using microarray technology by different investigators working on similar experiments, it is of interest to combine results across...
Hyungwon Choi, Ronglai Shen, Arul M. Chinnaiyan, D...
This paper presents a Bayesian approach to learning the connectivity structure of a group of neurons from data on configuration frequencies. A major objective of the research is t...
The K-Means and EM algorithms are popular in clustering and mixture modeling due to their simplicity and ease of implementation. However, they have several significant limitations...
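For reference only, a minimal sketch of the K-Means baseline the entry above refers to (plain Lloyd's algorithm), not the paper's proposed alternative; the toy two-blob data and the random initialization are assumptions.

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain Lloyd's algorithm: alternate assignment and centroid update."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]   # random initial centers
    for _ in range(n_iter):
        # Assign each point to its nearest center.
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # Recompute each center as the mean of its assigned points.
        new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels

# Toy usage: two well-separated Gaussian blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
centers, labels = kmeans(X, k=2)
print(centers)
```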
In simulation modeling and analysis, there are two situations where there is uncertainty about the number of parameters needed to specify a model. The first is in input modeling w...
We tackle the problem of object recognition using a Bayesian approach. A marked point process [1] is used as a prior model for the (unknown number of) objects. A sample is generat...
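Illustration only, not the authors' prior or sampler: the sketch below draws one configuration from a homogeneous marked Poisson point process, i.e. a Poisson number of objects with uniform locations and a random size mark attached to each point. The intensity, window, and mark distribution are assumptions.

```python
import numpy as np

def sample_marked_poisson(intensity=5.0, window=(1.0, 1.0), seed=0):
    """Draw one configuration from a homogeneous marked Poisson point process:
    a Poisson number of objects, uniform locations, and a size mark per point.
    Illustrates a prior over an unknown number of objects."""
    rng = np.random.default_rng(seed)
    area = window[0] * window[1]
    n = rng.poisson(intensity * area)                   # random number of objects
    locations = rng.uniform([0.0, 0.0], window, size=(n, 2))
    radii = rng.gamma(shape=2.0, scale=0.05, size=n)    # marks: object sizes
    return locations, radii

locs, radii = sample_marked_poisson()
print(len(locs), radii.round(3))
```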
In models that define probabilities via energies, maximum likelihood learning typically involves using Markov Chain Monte Carlo to sample from the model’s distribution. If the ...
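The entry breaks off above; as a hedged illustration of replacing the intractable model expectation with short-run MCMC, the sketch below trains a tiny binary restricted Boltzmann machine with one-step contrastive divergence (one Gibbs step per gradient estimate). The architecture, learning rate, and random toy data are assumptions, and this is offered as one common instance of the idea rather than the entry's own method.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def cd1_step(v0, W, b, c, lr=0.05):
    """One contrastive-divergence (CD-1) update for a binary RBM.

    The intractable model expectation in the maximum-likelihood gradient is
    replaced by statistics from a single Gibbs step started at the data.
    """
    # Positive phase: hidden units driven by the data.
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step v -> h -> v' -> h'.
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    # Gradient estimate: data statistics minus (approximate) model statistics.
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(v0)
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)

# Toy usage: 6 visible units, 3 hidden units, random binary "data".
n_vis, n_hid = 6, 3
W = 0.01 * rng.standard_normal((n_vis, n_hid))
b, c = np.zeros(n_vis), np.zeros(n_hid)
data = (rng.random((100, n_vis)) < 0.5).astype(float)
for epoch in range(50):
    cd1_step(data, W, b, c)
print(np.round(W, 3))
```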