We propose a novel method of dimensionality reduction for supervised learning. Given a regression or classification problem in which we wish to predict a variable Y from an expla...
Kenji Fukumizu, Francis R. Bach, Michael I. Jordan
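The truncated abstract sets up the supervised dimensionality-reduction problem: find a low-dimensional representation of the explanatory variables that retains the information needed to predict Y. The paper's own kernel-based estimator is not reproduced here; as a minimal NumPy illustration of the general setting only, here is a classical linear instance for a binary-classification Y (Fisher's discriminant direction):

```python
import numpy as np

def fisher_direction(X, y):
    """One-dimensional supervised projection via Fisher's linear discriminant.

    Returns the unit direction w such that z = X @ w best separates the two
    classes -- a classical linear instance of supervised dimension reduction,
    not the kernel method proposed in the paper.
    """
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter: sum of the per-class covariance matrices
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    w = np.linalg.solve(Sw, m1 - m0)
    return w / np.linalg.norm(w)

# Toy data: two Gaussian classes in 5 dimensions
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 5)),
               rng.normal(1.5, 1.0, (50, 5))])
y = np.array([0] * 50 + [1] * 50)

w = fisher_direction(X, y)
z = X @ w  # one-dimensional representation that still predicts y well
```

The projection `z` is the "reduced" explanatory variable: a single coordinate per sample chosen for its predictive relationship with Y, rather than for reconstruction of X as in unsupervised methods like PCA.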
In this work we propose a new supervised deformable model that generalizes the classical contour-based snake. This model is defined to deform in a feature space generated by a se...
This work investigates supervised word alignment methods that exploit inversion transduction grammar (ITG) constraints. We consider maximum margin and conditional likelihood objec...
Aria Haghighi, John Blitzer, John DeNero, Dan Klei...
Feature selection aims to reduce dimensionality for building comprehensible learning models with good generalization performance. Feature selection algorithms are largely studied ...
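The goal stated in this abstract can be made concrete with a small example. The following is not an algorithm from the paper, only a minimal filter-style sketch in NumPy: score each feature by its absolute Pearson correlation with the target and keep the top k, reducing dimensionality while retaining the predictive features.

```python
import numpy as np

def select_by_correlation(X, y, k):
    """Filter-style feature selection: keep the k features whose absolute
    Pearson correlation with the target y is largest (indices returned
    in ascending order)."""
    scores = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                       for j in range(X.shape[1])])
    keep = np.argsort(scores)[::-1][:k]
    return np.sort(keep)

# Synthetic regression data where only features 0 and 4 matter
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))
y = 3 * X[:, 0] - 2 * X[:, 4] + rng.normal(scale=0.1, size=200)

print(select_by_correlation(X, y, 2))  # expected: [0 4]
```

Filter methods like this score features independently of any learning model; wrapper and embedded methods instead evaluate feature subsets through the model itself, trading computation for accuracy.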
Inducing a grammar from text has proven to be a notoriously challenging learning task despite decades of research. The primary reason for its difficulty is that in order to induce...