Most previous work on trainable language generation has focused on two paradigms: (a) using a statistical model to rank a set of generated utterances, or (b) using statistics to i...
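The ranking paradigm in (a) can be pictured with a toy overgenerate-and-rank loop. The bigram scorer below is a hypothetical stand-in for the statistical model, not the one used in this line of work; it only illustrates how a set of candidate utterances is scored and the best one kept.

```python
# Toy sketch of paradigm (a), overgenerate-and-rank: score every candidate
# utterance with a statistical model and keep the best one. The bigram
# model with add-one smoothing is a hypothetical stand-in for the scorer.
import math
from collections import Counter

def train_bigram(tokens):
    bigrams = Counter(zip(tokens, tokens[1:]))
    unigrams = Counter(tokens)
    vocab = len(unigrams)

    def logprob(prev, word):
        if prev not in unigrams:
            return math.log(1.0 / vocab)          # unseen history: uniform back-off
        return math.log((bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab))

    return logprob

def score(utterance, logprob):
    toks = utterance.split()
    return sum(logprob(p, w) for p, w in zip(toks, toks[1:]))

def rank(candidates, logprob):
    # The generator proposes many utterances; the statistical model ranks them.
    return max(candidates, key=lambda u: score(u, logprob))

corpus = "the flight departs at noon . the flight arrives at night .".split()
lp = train_bigram(corpus)
print(rank(["flight the departs at noon", "the flight departs at noon"], lp))
# -> "the flight departs at noon"
```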
Dynamic languages typically allow programs to be written at a high level of abstraction. But their dynamic nature makes such languages very hard to compile, meaning that a price h...
Abstract. In this article, we propose the use of suffix arrays to efficiently implement n-gram language models with practically unlimited size n. This approach, which is used with ...
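As a concrete picture of the data structure named in this abstract, the sketch below builds a suffix array over a token stream and recovers the frequency of an arbitrary-length n-gram by binary search, so no explicit n-gram table has to be stored. The construction and the example corpus are illustrative only and are not taken from the article.

```python
# Illustrative sketch: a suffix array over the token stream lets us count
# an n-gram of any length n by binary search, with no precomputed n-gram tables.
def build_suffix_array(tokens):
    # O(n^2 log n) toy construction; practical implementations are O(n log n) or O(n).
    return sorted(range(len(tokens)), key=lambda i: tokens[i:])

def ngram_count(tokens, sa, ngram):
    # The suffixes starting with `ngram` form one contiguous block of the
    # suffix array; the size of that block is the n-gram's corpus frequency.
    n = len(ngram)

    def prefix(k):
        return tokens[sa[k]:sa[k] + n]

    lo, hi = 0, len(sa)
    while lo < hi:                        # lower bound of the block
        mid = (lo + hi) // 2
        if prefix(mid) < ngram:
            lo = mid + 1
        else:
            hi = mid
    start = lo
    hi = len(sa)
    while lo < hi:                        # upper bound of the block
        mid = (lo + hi) // 2
        if prefix(mid) <= ngram:
            lo = mid + 1
        else:
            hi = mid
    return lo - start

corpus = "a rose is a rose is a rose".split()
sa = build_suffix_array(corpus)
print(ngram_count(corpus, sa, ["a", "rose"]))        # 3
print(ngram_count(corpus, sa, ["rose", "is", "a"]))  # 2
```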
Paraphrasing methods recognize, generate, or extract phrases, sentences, or longer natural language expressions that convey almost the same information. Textual entailment methods...
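As a toy illustration of the "recognize" task just mentioned, the snippet below flags two sentences as paraphrases when their lexical overlap is high. This Jaccard baseline is only a placeholder for the far richer methods such work covers.

```python
# Toy paraphrase recognition by lexical overlap (Jaccard similarity).
# Illustrative baseline only; the threshold of 0.5 is an arbitrary assumption.
def jaccard(a, b):
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb)

def is_paraphrase(s1, s2, threshold=0.5):
    return jaccard(s1, s2) >= threshold

print(is_paraphrase("the company bought the startup",
                    "the startup was bought by the company"))  # True
```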
Abstract. We present an approach to define template languages for generating syntactically correct code. In the first part of the paper, we define the syntax and semantics of a ...
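The flavour of the approach can be hinted at with a much simpler stand-in: instantiate a code template and check that the result parses. The placeholder syntax and the after-the-fact parse check below are assumptions for illustration; the paper instead defines a template language whose syntax and semantics guarantee well-formed output by construction.

```python
# Illustrative stand-in: expand a code template, then verify the expansion
# is syntactically correct by running it through Python's own parser.
# The `${field}` placeholder syntax comes from string.Template, not the paper.
import ast
from string import Template

getter_template = Template(
    "def get_${field}(self):\n"
    "    return self._${field}\n"
)

def instantiate(template, **params):
    code = template.substitute(**params)
    ast.parse(code)   # raises SyntaxError if the expansion is not valid Python
    return code

print(instantiate(getter_template, field="name"))
```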