Learning Sentence-internal Temporal Relations

In this paper we propose a data-intensive approach for inferring sentence-internal temporal relations. Temporal inference is relevant for practical NLP applications which either extract or synthesize temporal information (e.g., summarisation, question answering). Our method bypasses the need for manual coding by exploiting the presence of markers like after, which overtly signal a temporal relation. We first show that models trained on main and subordinate clauses connected with a temporal marker achieve good performance on a pseudo-disambiguation task simulating temporal inference (during testing the temporal marker is treated as unseen and the models must select the right marker from a set of possible candidates). Second, we assess whether the proposed approach holds promise for the semi-automatic creation of temporal annotations. Specifically, we use a model trained on noisy and approximate data (i.e., main and subordinate clauses) to predict intra-sentential relations present in...
Maria Lapata, Alex Lascarides
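
As a rough illustration of the pseudo-disambiguation task described in the abstract, the Python sketch below hides the temporal marker linking a main and a subordinate clause and asks a scoring model to recover it from a set of candidate markers. The scoring function, candidate list, and example sentences are placeholders assumed for illustration; they are not taken from the paper's model.

# Minimal sketch of the pseudo-disambiguation setup: the marker joining a
# main and a subordinate clause is hidden, and a model must recover it from
# a set of candidate temporal markers. The scoring function is a stand-in
# (hypothetical), not the probabilistic model used in the paper.
from typing import Callable, Sequence, Tuple

CANDIDATE_MARKERS = ("after", "before", "while", "when", "once", "until")

def predict_marker(
    main_clause: str,
    sub_clause: str,
    score: Callable[[str, str, str], float],
    candidates: Sequence[str] = CANDIDATE_MARKERS,
) -> str:
    # Pick the candidate marker the model considers most likely for this pair.
    return max(candidates, key=lambda m: score(main_clause, sub_clause, m))

def accuracy(items: Sequence[Tuple[str, str, str]],
             score: Callable[[str, str, str], float]) -> float:
    # Fraction of clause pairs for which the hidden (gold) marker is recovered.
    correct = sum(predict_marker(m, s, score) == gold for m, s, gold in items)
    return correct / len(items) if items else 0.0

if __name__ == "__main__":
    # Toy scorer that always prefers "after", only to keep the sketch runnable.
    toy_score = lambda main, sub, marker: 1.0 if marker == "after" else 0.0
    test_items = [("She resigned", "the scandal broke", "after")]
    print(accuracy(test_items, toy_score))  # -> 1.0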
Type: Journal
Year: 2006
Where: JAIR
Authors: Maria Lapata, Alex Lascarides