
Complexity of Finding the BLEU-optimal Hypothesis in a Confusion Network

Confusion networks are a simple representation of multiple speech recognition or translation hypotheses in a machine translation system. A typical operation on a confusion network is to find the path which minimizes or maximizes a certain evaluation metric. In this article, we show that this problem is generally NP-hard for the popular BLEU metric, as well as for lower-order variants of BLEU. This also holds for more complex representations such as generic word graphs. In addition, we give an efficient polynomial-time algorithm to calculate unigram BLEU on confusion networks, but show that even small generalizations of this data structure render the problem NP-hard again. Since finding the optimal solution is thus not always feasible, we introduce an approximation algorithm based on a multi-stack decoder, which finds a (not necessarily optimal) solution for n-gram BLEU in polynomial time.
Gregor Leusch, Evgeny Matusov, Hermann Ney
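As a concrete illustration of the problem setting only (not the paper's polynomial-time unigram algorithm or its multi-stack decoder), the sketch below models a confusion network as a list of slots, each slot listing alternative words with an empty string standing for the epsilon arc, and finds the unigram-BLEU-best path against a single reference by brute-force enumeration. All names and scoring details here are illustrative assumptions.

```python
# Illustrative sketch only: a hypothetical confusion-network toy, not the
# algorithm from Leusch, Matusov & Ney (2008).
from collections import Counter
from itertools import product
import math

def unigram_bleu(hypothesis, reference):
    """Clipped unigram precision times the BLEU brevity penalty."""
    if not hypothesis:
        return 0.0
    hyp_counts = Counter(hypothesis)
    ref_counts = Counter(reference)
    matches = sum(min(count, ref_counts[word]) for word, count in hyp_counts.items())
    precision = matches / len(hypothesis)
    # Brevity penalty: 1 if the hypothesis is longer than the reference,
    # exp(1 - r/c) otherwise.
    bp = 1.0 if len(hypothesis) > len(reference) else math.exp(1.0 - len(reference) / len(hypothesis))
    return bp * precision

def best_path_brute_force(network, reference):
    """Enumerate every path through the confusion network (exponential in the
    number of slots) and return the path with the highest unigram BLEU."""
    best_score, best_path = -1.0, []
    for choice in product(*network):
        path = [word for word in choice if word]  # drop epsilon arcs
        score = unigram_bleu(path, reference)
        if score > best_score:
            best_score, best_path = score, path
    return best_path, best_score

if __name__ == "__main__":
    # Each inner list is one slot of the confusion network; "" is the epsilon arc.
    network = [["the", "a"], ["cat", "hat", ""], ["sat", "sits"], ["down", ""]]
    reference = "the cat sat down".split()
    path, score = best_path_brute_force(network, reference)
    print(path, round(score, 3))
```

The exponential cost of enumerating all paths is exactly what the paper's complexity results address: unigram BLEU admits a polynomial-time exact algorithm on confusion networks, while higher n-gram orders and richer graph structures make the exact problem NP-hard.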
Type: Conference
Year: 2008
Where: EMNLP
Authors: Gregor Leusch, Evgeny Matusov, Hermann Ney