
EMNLP 2011

Lateen EM: Unsupervised Training with Multiple Objectives, Applied to Dependency Grammar Induction

We present new training methods that aim to mitigate local optima and slow convergence in unsupervised training by using additional imperfect objectives. In its simplest form, lateen EM alternates between the two objectives of ordinary “soft” and “hard” expectation maximization (EM) algorithms. Switching objectives when stuck can help escape local optima. We find that applying a single such alternation already yields state-of-the-art results for English dependency grammar induction. More elaborate lateen strategies track both objectives, with each validating the moves proposed by the other. Disagreements can signal earlier opportunities to switch or terminate, saving iterations. De-emphasizing fixed points in these ways eliminates some guesswork from tuning EM. An evaluation against a suite of unsupervised dependency parsing tasks, for a variety of languages, showed that lateen strategies significantly speed up training of both EM algorithms, and improve accuracy for hard EM.
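As a rough illustration of the alternation the abstract describes, below is a minimal sketch of the simplest lateen strategy: run soft EM until its objective stops improving, then hand the parameters to hard (Viterbi) EM. It uses a toy two-component Gaussian mixture rather than the paper's dependency-grammar model, and the helper names, tolerances, and default single alternation are assumptions made for this example.

import numpy as np

def e_step(x, mu, var, pi, hard=False):
    # Per-point, per-component joint log-probabilities under the current parameters.
    log_p = (-0.5 * ((x[:, None] - mu) ** 2 / var + np.log(2 * np.pi * var))
             + np.log(pi))
    if hard:
        # Hard ("Viterbi") EM: one-hot assignment to the best component.
        r = np.zeros_like(log_p)
        r[np.arange(len(x)), log_p.argmax(axis=1)] = 1.0
        obj = (r * log_p).sum()                         # likelihood of the best assignments
    else:
        # Soft EM: posterior responsibilities.
        r = np.exp(log_p - log_p.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        obj = np.logaddexp.reduce(log_p, axis=1).sum()  # marginal log-likelihood
    return r, obj

def m_step(x, r):
    n_k = r.sum(axis=0) + 1e-12
    mu = (r * x[:, None]).sum(axis=0) / n_k
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k + 1e-6
    pi = n_k / n_k.sum()
    return mu, var, pi

def run_em(x, params, hard, tol=1e-6, max_iter=200):
    # Run one flavor of EM until its own objective stops improving ("stuck").
    mu, var, pi = params
    prev = -np.inf
    for _ in range(max_iter):
        r, obj = e_step(x, mu, var, pi, hard=hard)
        if obj - prev < tol:
            break
        prev = obj
        mu, var, pi = m_step(x, r)
    return (mu, var, pi), prev

def simple_lateen_em(x, params, alternations=1):
    # Simplest lateen strategy: soft EM until stuck, then hard EM until stuck
    # (a single such alternation by default, as in the paper's simplest variant).
    for _ in range(alternations):
        params, _ = run_em(x, params, hard=False)
        params, _ = run_em(x, params, hard=True)
    return params

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.0, 300)])
    init = (np.array([-1.0, 1.0]), np.array([1.0, 1.0]), np.array([0.5, 0.5]))
    mu, var, pi = simple_lateen_em(x, init)
    print("means:", mu, "weights:", pi)

The more elaborate strategies in the paper additionally monitor the secondary objective during training to decide when to switch or stop early; that bookkeeping is omitted from this sketch.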
Valentin I. Spitkovsky, Hiyan Alshawi, Daniel Jurafsky
Added: 20 Dec 2011
Updated: 20 Dec 2011
Type: Conference
Year: 2011
Where: EMNLP
Authors: Valentin I. Spitkovsky, Hiyan Alshawi, Daniel Jurafsky