
ACL 2012

Crowdsourcing Inference-Rule Evaluation

The importance of inference rules to semantic applications has long been recognized, and extensive work has been carried out to acquire inference-rule resources automatically. However, evaluating such resources has turned out to be a non-trivial task, slowing progress in the field. In this paper, we suggest a framework for evaluating inference-rule resources. Our framework simplifies a previously proposed “instance-based evaluation” method that involved substantial annotator training, making it suitable for crowdsourcing. We show that our method produces a large number of annotations with high inter-annotator agreement, at low cost and in a short period of time, without the need to train expert annotators.
Naomi Zeichner, Jonathan Berant, Ido Dagan
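The abstract reports quality in terms of inter-annotator agreement over crowdsourced judgments of rule applications. As a rough illustration only (not the authors' code; the agreement statistic, label names, and data below are assumptions), the following sketch computes Fleiss' kappa over hypothetical yes/no judgments of rule applications.

```python
# Illustrative sketch: aggregating crowdsourced "valid"/"invalid" judgments
# on inference-rule applications and measuring inter-annotator agreement
# with Fleiss' kappa. All item and label names here are hypothetical.

from collections import Counter

def fleiss_kappa(label_counts, num_raters):
    """label_counts: list of Counters, one per item, mapping label -> #raters.
    Assumes every item was judged by exactly `num_raters` annotators."""
    n_items = len(label_counts)
    labels = {lab for counts in label_counts for lab in counts}

    # Mean per-item observed agreement.
    p_bar = sum(
        (sum(c * c for c in counts.values()) - num_raters)
        / (num_raters * (num_raters - 1))
        for counts in label_counts
    ) / n_items

    # Chance agreement from the marginal label distribution.
    p_e = sum(
        (sum(counts[lab] for counts in label_counts) / (n_items * num_raters)) ** 2
        for lab in labels
    )
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical judgments: 5 crowd workers label each rule application.
judgments = [
    Counter(valid=5, invalid=0),
    Counter(valid=4, invalid=1),
    Counter(valid=1, invalid=4),
    Counter(valid=0, invalid=5),
]
print(f"Fleiss' kappa: {fleiss_kappa(judgments, num_raters=5):.3f}")
```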
Type Conference
Year 2012
Where ACL