Latent Dirichlet allocation (LDA) and other related topic models are increasingly popular tools for summarization and manifold discovery in discrete data. However, LDA does not ca...
The power and popularity of kernel methods stem in part from their ability to handle diverse forms of structured inputs, including vectors, graphs and strings. Recently, several m...
Darrin P. Lewis, Tony Jebara, William Stafford Nob...
We present an efficient method for maximizing energy functions with first- and second-order potentials, suitable for MAP labeling estimation problems that arise in undirected graph...
Many of today's best classification results are obtained by combining the responses of a set of base classifiers to produce an answer for the query. This paper explores a nov...
If appropriately used, prior knowledge can significantly improve the predictive accuracy of learning algorithms or reduce the amount of training data needed. In this paper we intr...
The Gaussian process latent variable model (GP-LVM) is a generative approach to nonlinear low-dimensional embedding that provides a smooth probabilistic mapping from latent to da...
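A minimal sketch of the GP-LVM idea referred to in the abstract above: latent points X are chosen to maximize the Gaussian-process marginal likelihood of the observed data Y, which yields the probabilistic mapping from latent to data space. The kernel, hyperparameters, random initialization, and off-the-shelf optimizer below are illustrative assumptions, not the paper's own formulation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist

def neg_log_marginal(x_flat, Y, q, lengthscale=1.0, noise=0.1):
    """Negative GP marginal log-likelihood of Y (N x D) given latent X (N x q),
    with an RBF kernel shared across output dimensions."""
    N, D = Y.shape
    X = x_flat.reshape(N, q)
    K = np.exp(-0.5 * cdist(X, X, "sqeuclidean") / lengthscale**2) + noise * np.eye(N)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))   # K^{-1} Y
    logdet = 2.0 * np.log(np.diag(L)).sum()               # log |K|
    return 0.5 * D * logdet + 0.5 * np.sum(Y * alpha)

# Example: embed 30 points from a noisy 1-D curve living in 5-D into a 2-D latent space.
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 30)
Y = np.stack([np.sin(3 * t), np.cos(3 * t), t, t**2, t**3], axis=1) + 0.01 * rng.normal(size=(30, 5))
Y -= Y.mean(0)
q = 2
x0 = 0.1 * rng.normal(size=30 * q)                        # random latent initialization
res = minimize(neg_log_marginal, x0, args=(Y, q), method="L-BFGS-B")
X_latent = res.x.reshape(30, q)
print(X_latent[:5])
```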
We introduce the use of learned shaping rewards in reinforcement learning tasks, where an agent uses prior experience on a sequence of tasks to learn a portable predictor that est...
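A short sketch of the reward-shaping idea referenced in the abstract above: a shaping term derived from a potential function is added to the environment reward during learning. The potential-based form r + gamma * phi(s') - phi(s) and the tabular Q-learning setting shown here are the standard illustration; whether the learned predictor in the paper takes exactly this form is an assumption.

```python
import numpy as np

def q_learning_step(Q, s, a, r, s_next, phi, gamma=0.99, alpha=0.1):
    """One tabular Q-learning update using a shaped reward, where phi(s) is a
    (possibly learned) potential function supplying prior knowledge."""
    shaped_r = r + gamma * phi(s_next) - phi(s)       # potential-based shaping term
    td_target = shaped_r + gamma * Q[s_next].max()
    Q[s, a] += alpha * (td_target - Q[s, a])
    return Q

# Example: a 5-state chain where the potential is a hypothetical learned
# estimate of progress toward the goal state 4.
n_states, n_actions = 5, 2
Q = np.zeros((n_states, n_actions))
phi = lambda s: float(s)                              # illustrative potential only
Q = q_learning_step(Q, s=2, a=1, r=0.0, s_next=3, phi=phi)
print(Q)
```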
Kernel learning plays an important role in many machine learning tasks. However, algorithms for learning a kernel matrix often scale poorly, with running times that are cubic in t...
A multiclass classification problem can be reduced to a collection of binary problems with the aid of a coding matrix. The quality of the final solution, which is an ensemble of b...
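A minimal sketch of the coding-matrix (error-correcting output code) reduction mentioned in the abstract above: each column of a +/-1 coding matrix defines one binary problem, and predictions are decoded against the class code words. The base learner (scikit-learn's LogisticRegression), the particular coding matrix, and the Hamming-style decoding are illustrative assumptions, not the paper's construction.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def fit_ecoc(X, y, code):
    """Train one binary classifier per column of the +/-1 coding matrix `code`
    (shape: n_classes x n_bits); class k is relabeled by code[k, j] in problem j."""
    return [LogisticRegression().fit(X, code[y, j]) for j in range(code.shape[1])]

def predict_ecoc(models, X, code):
    """Decode by Hamming-style distance: pick the class whose code word is
    closest to the vector of binary predictions."""
    bits = np.column_stack([m.predict(X) for m in models])        # n_samples x n_bits
    dists = ((bits[:, None, :] - code[None, :, :]) != 0).sum(-1)  # distance to each code word
    return dists.argmin(axis=1)

# Example: 4 classes encoded with a 6-bit coding matrix (every column uses both labels).
code = np.array([
    [ 1,  1,  1, -1, -1, -1],
    [ 1, -1, -1,  1,  1, -1],
    [-1,  1, -1,  1, -1,  1],
    [-1, -1,  1, -1,  1,  1],
])
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = rng.integers(0, 4, size=200)
models = fit_ecoc(X, y, code)
print(predict_ecoc(models, X, code)[:10])
```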
We propose efficient particle smoothing methods for generalized state-space models. Particle smoothing is an expensive O(N^2) algorithm, where N is the number of particles. We ov...
Mike Klaas, Mark Briers, Nando de Freitas, Arnaud ...
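To make the O(N^2) cost noted in the abstract above concrete, here is a sketch of the standard backward recursion for smoothed particle weights, whose per-time-step double sum over particle pairs is the quadratic bottleneck (the paper is about avoiding this cost, not this routine). The linear-Gaussian transition and helper names are illustrative assumptions.

```python
import numpy as np

def backward_smoothing_weights(particles, filter_weights, trans_logpdf):
    """particles: (T, N) filtered particles; filter_weights: (T, N), normalized.
    trans_logpdf(x_next, x_prev) -> log f(x_next | x_prev) up to a constant, broadcastable.
    Returns (T, N) smoothed weights via the O(T * N^2) backward recursion."""
    T, N = particles.shape
    smoothed = np.empty((T, N))
    smoothed[-1] = filter_weights[-1]
    for t in range(T - 2, -1, -1):
        # log f(x_{t+1}^(j) | x_t^(i)) for every (j, i) pair: the O(N^2) step.
        log_f = trans_logpdf(particles[t + 1][:, None], particles[t][None, :])
        f = np.exp(log_f - log_f.max())                    # constants cancel in the ratio below
        denom = f @ filter_weights[t]                      # sum_k w_t^k f(x_{t+1}^j | x_t^k)
        smoothed[t] = filter_weights[t] * ((f / denom[:, None]).T @ smoothed[t + 1])
    return smoothed

# Example with a random-walk transition x_{t+1} ~ N(x_t, 1).
rng = np.random.default_rng(1)
T, N = 10, 500
parts = rng.normal(size=(T, N))
w = np.full((T, N), 1.0 / N)
logpdf = lambda x_next, x_prev: -0.5 * (x_next - x_prev) ** 2
print(backward_smoothing_weights(parts, w, logpdf).sum(axis=1))  # each row sums to ~1
```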