Neural probabilistic parsers are attractive for their ability to combine features automatically and for their small data sizes. A transition-based greedy neural parser has given better ac...
Intelligent personal assistant software such as Apple’s Siri and Samsung’s S-Voice has been released in recent years. This paper introduces a novel Spoken Language Understand...
We introduce into Bayesian decipherment a base distribution derived from similarities of word embeddings. We use Dirichlet multinomial regression (Mimno and McCallum, 2008) to lea...
Qing Dou, Ashish Vaswani, Kevin Knight, Chris Dyer
We describe the first version of the Media Frames Corpus: several thousand news articles on three policy issues, annotated in terms of media framing. We motivate framing as a phe...
Dallas Card, Amber E. Boydstun, Justin H. Gross, P...
Continuous space word embeddings learned from large, unstructured corpora have been shown to be effective at capturing semantic regularities in language. In this paper we replace ...
In this paper, we propose the concept of summary prior to define how appropriate a sentence is for inclusion in a summary, without considering its context. Different fr...
Ziqiang Cao, Furu Wei, Sujian Li, Wenjie Li, Ming ...
Prepositional phrases (PPs) express crucial information that knowledge base construction methods need to extract. However, PPs are a major source of syntactic ambiguity and still ...
In this paper, we present a test collection for mathematical information retrieval composed of real-life, research-level mathematical information needs. Topics and relevance judgem...
Distributional semantic models have trouble distinguishing strongly contrasting words (such as antonyms) from highly compatible ones (such as synonyms), because both kinds tend to...