Learning Semantic Correspondences with Less Supervision

A central problem in grounded language acquisition is learning the correspondences between a rich world state and a stream of text that references that world state. To deal with the high degree of ambiguity present in this setting, we present a generative model that simultaneously segments the text into utterances and maps each utterance to a meaning representation grounded in the world state. We show that our model generalizes across three domains of increasing difficulty: Robocup sportscasting, weather forecasts (a new domain), and NFL recaps.
Percy Liang, Michael I. Jordan, Dan Klein
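
The generative story sketched in the abstract (choose part of the world state, then emit an utterance grounded in it) can be illustrated with a toy example. The sketch below is hypothetical and not the authors' actual model: the weather records, field names, and fixed templates are invented stand-ins for the learned record-choice, field-choice, and word-generation distributions the paper describes.

import random

# Hypothetical world state for a weather-forecast scene: a set of
# records, each with a type and typed fields.
WORLD_STATE = [
    {"type": "temperature", "min": 43, "max": 61},
    {"type": "windSpeed", "min": 5, "max": 10},
    {"type": "skyCover", "mode": "mostly cloudy"},
]

# Hypothetical templated word emissions per (record type, field),
# standing in for a learned word-generation distribution.
TEMPLATES = {
    ("temperature", "max"): lambda v: f"a high of {v}",
    ("temperature", "min"): lambda v: f"a low of {v}",
    ("windSpeed", "max"): lambda v: f"winds up to {v} mph",
    ("skyCover", "mode"): lambda v: f"{v} skies",
}

def generate_utterance(world_state):
    """One step of the generative story: record -> field -> words."""
    record = random.choice(world_state)            # choose a record
    fields = [k for k in record if k != "type"]
    field = random.choice(fields)                  # choose one of its fields
    # Fall back to the raw value if no template covers this pair.
    render = TEMPLATES.get((record["type"], field), lambda v: str(v))
    return render(record[field])                   # emit words for the value

if __name__ == "__main__":
    random.seed(0)
    # A "text" is a sequence of utterances, each aligned to a record and field.
    print(", ".join(generate_utterance(WORLD_STATE) for _ in range(3)))

In the paper's setting, these record, field, and word choices are latent: only the world state and the raw text are observed, and the model must infer the segmentation and alignments, which is where the ambiguity arises.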
Added 16 Feb 2011
Updated 16 Feb 2011
Type Conference
Year 2009
Where ACL
Authors Percy Liang, Michael I. Jordan, Dan Klein