
EMNLP 2006

Competitive generative models with structure learning for NLP classification tasks

In this paper we show that generative models are competitive with, and sometimes superior to, discriminative models when both kinds of models are allowed to learn structures that are optimal for discrimination. In particular, we compare Bayesian networks and conditional log-linear models on two NLP tasks. We observe that when the structure of the generative model encodes very strong independence assumptions (à la Naive Bayes), the discriminative model is superior, but when the generative model is allowed to weaken these independence assumptions by learning a more complex structure, it can achieve performance very similar to, or better than, that of a corresponding discriminative model. In addition, since structure learning is far more efficient for generative models, they may be preferable for some tasks.
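To make the contrast concrete, the following is a minimal Python sketch, not the paper's implementation or data, of the two generative factorizations the abstract contrasts: a Naive Bayes model in which every feature is independent given the class, and the same model augmented with a single feature-to-feature edge of the kind a structure learner could add. The feature values, the toy data, and the hand-picked edge are assumptions made purely for illustration.

from collections import defaultdict
from math import log

class NaiveBayesJoint:
    """Naive Bayes factorization: P(y, x) = P(y) * prod_i P(x_i | y).
    Every feature is conditionally independent of the others given the class."""
    def __init__(self, data, alpha=1.0):
        self.alpha = alpha
        self.labels = sorted({y for _, y in data})
        self.n = len(data)
        self.label_counts = defaultdict(int)   # y -> count
        self.feat_counts = defaultdict(int)    # (i, x_i, y) -> count
        self.feat_values = defaultdict(set)    # i -> observed values of feature i
        for x, y in data:
            self.label_counts[y] += 1
            for i, v in enumerate(x):
                self.feat_counts[(i, v, y)] += 1
                self.feat_values[i].add(v)

    def log_joint(self, x, y):
        # log P(y) with add-alpha smoothing
        lp = log((self.label_counts[y] + self.alpha) /
                 (self.n + self.alpha * len(self.labels)))
        # plus sum_i log P(x_i | y)
        for i, v in enumerate(x):
            num = self.feat_counts[(i, v, y)] + self.alpha
            den = self.label_counts[y] + self.alpha * len(self.feat_values[i])
            lp += log(num / den)
        return lp

    def predict(self, x):
        return max(self.labels, key=lambda y: self.log_joint(x, y))


class AugmentedJoint(NaiveBayesJoint):
    """Same model with one extra edge: feature 1 also conditions on feature 0,
    i.e. P(y, x) = P(y) P(x_0 | y) P(x_1 | x_0, y) prod_{i>1} P(x_i | y).
    A real structure learner would choose such edges to optimize a criterion
    (the paper argues for a discriminative one); here the edge is fixed by hand."""
    def __init__(self, data, alpha=1.0):
        super().__init__(data, alpha)
        self.pair_counts = defaultdict(int)    # (x_0, x_1, y) -> count
        self.parent_counts = defaultdict(int)  # (x_0, y) -> count
        for x, y in data:
            self.pair_counts[(x[0], x[1], y)] += 1
            self.parent_counts[(x[0], y)] += 1

    def log_joint(self, x, y):
        lp = super().log_joint(x, y)
        # swap the Naive Bayes term log P(x_1 | y) for log P(x_1 | x_0, y)
        num_nb = self.feat_counts[(1, x[1], y)] + self.alpha
        den_nb = self.label_counts[y] + self.alpha * len(self.feat_values[1])
        lp -= log(num_nb / den_nb)
        num = self.pair_counts[(x[0], x[1], y)] + self.alpha
        den = self.parent_counts[(x[0], y)] + self.alpha * len(self.feat_values[1])
        lp += log(num / den)
        return lp


# Toy usage with made-up data: two categorical features, binary label.
data = [(("low", "a"), 0), (("low", "b"), 0), (("high", "a"), 1), (("high", "b"), 1)]
nb, aug = NaiveBayesJoint(data), AugmentedJoint(data)
print(nb.predict(("high", "a")), aug.predict(("high", "a")))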
Kristina Toutanova
Type: Conference
Year: 2006
Where: EMNLP
Authors: Kristina Toutanova