Efficient Decoding for Statistical Machine Translation with a Fully Expanded WFST Model

This paper proposes a novel method of compiling statistical models for machine translation to achieve efficient decoding. In our method, each statistical submodel is represented as a weighted finite-state transducer (WFST), and all of the submodels are expanded into a single composition model beforehand. Furthermore, the ambiguity of the composition model is reduced using statistics of hypotheses gathered during decoding. The experimental results show that the proposed model representation drastically improves decoding efficiency compared to dynamic composition of the submodels, which corresponds to the conventional approach.
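To illustrate the core idea, the sketch below uses pynini (a Python wrapper around OpenFst) to precompose two toy submodels into a single WFST offline, so that decoding reduces to composing the input with the precompiled model and extracting the shortest path. This is only a minimal sketch of the general technique, not the paper's implementation: the phrase pairs, weights, and the decode helper are invented for illustration, and the ambiguity-reduction step driven by hypothesis statistics is not shown.

```python
# Minimal sketch of offline ("fully expanded") WFST composition, assuming
# pynini >= 2.1. The toy phrase pairs and weights are invented for illustration.
import pynini

# Translation-model-like submodel T: a union of source -> target phrase transducers.
T = pynini.union(
    pynini.cross("el gato", "the cat"),
    pynini.cross("el perro", "the dog"),
).optimize()

# Language-model-like submodel G: a weighted acceptor over target strings
# (lower cost is better in the default tropical semiring).
G = pynini.union(
    pynini.accep("the cat", weight=0.5),
    pynini.accep("the dog", weight=1.5),
).optimize()

# Fully expanded model: compose all submodels beforehand, offline.
M = pynini.compose(T, G).optimize()

def decode(source: str) -> str:
    # At decode time, only compose the input with the precompiled model M
    # and read off the best (lowest-cost) output string.
    lattice = pynini.compose(pynini.accep(source), M)
    return pynini.shortestpath(lattice).project("output").rmepsilon().string()

print(decode("el gato"))   # expected: "the cat"
```

By contrast, the dynamic-composition baseline described in the abstract would rebuild the composed search space from the submodels on the fly for every input sentence, which is exactly the work the precompiled model avoids.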
Type Conference
Year 2004
Where EMNLP
Authors Hajime Tsukada, Masaaki Nagata