The performance of n-gram language models depends to a large extent on the amount of training text available for building the models and the degree to which this text matches...
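To make the dependence on training data concrete, here is a minimal sketch of a count-based bigram model with add-one smoothing; all names and the toy corpus are illustrative assumptions, not from the abstract. Bigrams actually seen in the training text receive much higher probability than unseen ones, which is why both the amount and the domain match of the training text matter.

```python
from collections import Counter

def train_bigram(tokens):
    # Count unigrams and bigrams from a token stream.
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def bigram_prob(unigrams, bigrams, w1, w2, vocab_size):
    # Add-one (Laplace) smoothed conditional probability P(w2 | w1).
    return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + vocab_size)

corpus = "the cat sat on the mat the cat ran".split()
uni, bi = train_bigram(corpus)
vocab = len(uni)

p_seen = bigram_prob(uni, bi, "the", "cat", vocab)    # observed in training
p_unseen = bigram_prob(uni, bi, "the", "dog", vocab)  # only smoothing mass
```

With more (or better-matched) training text, more of the bigrams in the test domain fall into the "seen" case, which is the effect the abstract describes.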
The intersection of tree transducer-based translation models with n-gram language models results in huge dynamic programs for machine translation decoding. We propose a multipass,...
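The multipass idea can be illustrated with a tiny coarse-to-fine rescoring sketch: a cheap coarse score prunes the hypothesis space before an expensive fine score ranks the survivors. The toy models, scores, and hypothesis lists below are all hypothetical placeholders, not the paper's actual decoder.

```python
# Toy log-probability tables (illustrative assumptions only).
UNIGRAM = {"the": -1.0, "cat": -2.0, "dog": -2.0, "sat": -2.5, "zzz": -9.0}
BIGRAM = {("<s>", "the"): -0.5, ("the", "cat"): -1.0,
          ("the", "dog"): -1.5, ("cat", "sat"): -1.2}

def coarse(hyp):
    # Cheap unigram score used as a first pruning pass.
    return sum(UNIGRAM.get(w, -10.0) for w in hyp)

def fine(hyp):
    # More expensive bigram score applied only to survivors.
    ws = ["<s>"] + hyp
    return sum(BIGRAM.get(pair, -10.0) for pair in zip(ws, ws[1:]))

hyps = [["the", "cat", "sat"], ["the", "dog", "sat"], ["zzz", "zzz", "zzz"]]
# Pass 1: keep only the top-2 hypotheses under the coarse model.
survivors = sorted(hyps, key=coarse, reverse=True)[:2]
# Pass 2: rescore the pruned set with the fine model.
best = max(survivors, key=fine)
```

The design point is that the coarse pass discards most of the huge search space cheaply, so the fine (e.g. full n-gram) model is only ever applied to a small pruned set.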