Sciweavers

EMNLP 2008

Coarse-to-Fine Syntactic Machine Translation using Language Projections

The intersection of tree-transducer-based translation models with n-gram language models results in huge dynamic programs for machine translation decoding. We propose a multipass, coarse-to-fine approach in which language model complexity is introduced incrementally. In contrast to previous order-based bigram-to-trigram approaches, we focus on encoding-based methods, which use a clustered encoding of the target language. Across various encoding schemes, and for multiple language pairs, we show speed-ups of up to 50 times over single-pass decoding while improving BLEU. Moreover, our entire decoding cascade for trigram language models is faster than the corresponding bigram pass alone of a bigram-to-trigram decoder.
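The encoding-based idea in the abstract can be illustrated with a minimal sketch. This is not the paper's decoder: all data (the word-to-cluster map, the LM scores, the candidate translations) is invented, and real systems prune inside a dynamic program rather than over whole candidate lists. The sketch only shows the two-pass shape: score candidates with a cheap language model over a clustered projection of the target vocabulary, prune against a beam, then rescore the survivors with the full word-level model.

```python
# Toy coarse-to-fine rescoring sketch. Everything here is invented
# illustration data, not the paper's models or numbers.

# Coarse projection: each target word maps to a cluster id.
CLUSTER = {"the": 0, "a": 0, "cat": 1, "dog": 1, "sat": 2, "ran": 2}

# Invented coarse bigram log-scores over cluster pairs (higher = better).
COARSE_LM = {
    (0, 1): 0.0, (1, 2): 0.0,
    (0, 0): -5.0, (1, 1): -4.0, (2, 2): -4.0,
    (0, 2): -4.0, (1, 0): -3.0, (2, 0): -2.0, (2, 1): -3.0,
}

# Invented fine bigram log-scores over word pairs.
FINE_LM = {
    ("the", "cat"): -0.5, ("the", "dog"): -0.7,
    ("cat", "sat"): -0.4, ("dog", "ran"): -0.6,
}

def score(seq, lm, project=lambda w: w):
    """Sum bigram scores of seq under lm, after projecting each word."""
    return sum(lm.get((project(a), project(b)), -10.0)
               for a, b in zip(seq, seq[1:]))

def coarse_to_fine(candidates, beam=2.0):
    # Pass 1: cheap scoring in the clustered (coarse) language.
    coarse = {c: score(c, COARSE_LM, CLUSTER.__getitem__) for c in candidates}
    best = max(coarse.values())
    # Prune everything outside the coarse beam.
    survivors = [c for c in candidates if coarse[c] >= best - beam]
    # Pass 2: full word-level scoring, but only on the survivors.
    return max(survivors, key=lambda c: score(c, FINE_LM))

candidates = [
    ("the", "cat", "sat"),
    ("the", "dog", "ran"),
    ("the", "the", "cat"),   # bad cluster bigram (0,0): pruned in pass 1
    ("cat", "cat", "sat"),   # bad cluster bigram (1,1): pruned in pass 1
]
winner = coarse_to_fine(candidates)
```

Here the coarse pass discards half the candidates before the expensive fine model is ever consulted; in the paper this pruning happens over shared dynamic-program states, which is where the reported speed-ups come from.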
Slav Petrov, Aria Haghighi, Dan Klein