Hidden Markov Models (HMMs) model sequential data in many fields such as text/speech processing and biosignal analysis. Active learning algorithms learn faster and/or better by cl...
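For context only, a minimal sketch of how an HMM assigns likelihood to an observation sequence via the forward algorithm; the two-state transition, emission, and initial-state parameters are illustrative values, not anything from the paper, and no active-learning component is shown.

```python
# Minimal discrete-HMM forward algorithm (illustrative parameters only).
import numpy as np

A = np.array([[0.7, 0.3],    # P(next state | current state)
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],    # P(observation | state)
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])    # initial state distribution

def forward_likelihood(obs):
    """Return P(obs) by summing over all hidden state paths."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

print(forward_likelihood([0, 1, 1, 0]))
```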
Dimensionality reduction is the problem of finding a low-dimensional representation of high-dimensional input data. This paper examines the case where additional information is kno...
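As an illustration of the basic problem statement only (not the paper's method, which exploits additional information), a minimal PCA sketch that maps high-dimensional points to a low-dimensional representation:

```python
# Standard unsupervised PCA baseline: project onto the leading principal directions.
import numpy as np

def pca(X, k):
    """Project the rows of X onto the k leading principal components."""
    Xc = X - X.mean(axis=0)                      # center the data
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                         # low-dimensional coordinates

X = np.random.randn(100, 20)                     # 100 points in 20 dimensions
Z = pca(X, 2)                                    # 2-dimensional representation
print(Z.shape)                                   # (100, 2)
```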
We present a novel unsupervised learning scheme that simultaneously clusters variables of several types (e.g., documents, words and authors) based on pairwise interactions between...
We consider reinforcement learning in systems with unknown dynamics. Algorithms such as E3 (Kearns and Singh, 2002) learn near-optimal policies by using "exploration policies...
We present a unified duality view of several recently emerged spectral methods for nonlinear dimensionality reduction, including Isomap, locally linear embedding, Laplacian eigenm...
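To make the shared "embedding from an eigenproblem" structure of these methods concrete, here is a minimal sketch of one of them, Laplacian eigenmaps; the neighbourhood size and binary edge weights are illustrative choices, and this is not the unified duality formulation itself.

```python
# Laplacian eigenmaps sketch: k-NN graph, graph Laplacian, generalized eigenproblem.
import numpy as np
from scipy.linalg import eigh

def laplacian_eigenmaps(X, n_components=2, k=10):
    """Embed the rows of X using eigenvectors of the graph Laplacian."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    W = np.zeros_like(d2)
    idx = np.argsort(d2, axis=1)[:, 1:k + 1]               # k nearest neighbours (skip self)
    for i, nbrs in enumerate(idx):
        W[i, nbrs] = 1.0
    W = np.maximum(W, W.T)                                  # symmetrize the graph
    D = np.diag(W.sum(axis=1))
    L = D - W                                               # unnormalized graph Laplacian
    vals, vecs = eigh(L, D)                                 # solve L f = lambda D f
    return vecs[:, 1:n_components + 1]                      # drop the constant eigenvector

X = np.random.randn(200, 5)
Y = laplacian_eigenmaps(X)
print(Y.shape)   # (200, 2)
```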
We present a new unsupervised algorithm for training structured predictors that is discriminative, convex, and avoids the use of EM. The idea is to formulate an unsupervised versi...
Linli Xu, Dana F. Wilkinson, Finnegan Southey, Dal...
Most work on preference learning has focused on pairwise preferences or rankings over individual items. In this paper, we present a method for learning preferences over sets of it...
Semi-naive Bayesian classifiers seek to retain the numerous strengths of naive Bayes while reducing error by weakening the attribute independence assumption. Backwards Sequential ...
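For reference, a minimal sketch of the attribute-independence assumption that semi-naive methods weaken: plain naive Bayes scores a class c as P(c) times a product of per-attribute likelihoods P(x_i | c). The toy data is made up, and no smoothing or attribute elimination is shown.

```python
# Plain naive Bayes with categorical attributes (illustrative toy data).
from collections import Counter, defaultdict

def train_naive_bayes(examples):
    """examples: list of (attribute_tuple, class_label) pairs."""
    class_counts = Counter(c for _, c in examples)
    attr_counts = defaultdict(Counter)            # (attribute index, class) -> value counts
    for x, c in examples:
        for i, v in enumerate(x):
            attr_counts[(i, c)][v] += 1
    n = len(examples)

    def predict(x):
        def score(c):
            s = class_counts[c] / n               # prior P(c)
            for i, v in enumerate(x):
                s *= attr_counts[(i, c)][v] / class_counts[c]   # independence assumption
            return s
        return max(class_counts, key=score)

    return predict

data = [(("sunny", "hot"), "no"), (("rainy", "mild"), "yes"),
        (("sunny", "mild"), "yes"), (("rainy", "hot"), "no")]
predict = train_naive_bayes(data)
print(predict(("sunny", "mild")))
```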
We describe a statistical approach to software debugging in the presence of multiple bugs. Due to sparse sampling issues and complex interaction between program predicates, many g...
Alice X. Zheng, Michael I. Jordan, Ben Liblit, May...