Sciweavers

301 search results - page 34 / 61
» Using Wikipedia for Automatic Word Sense Disambiguation
WWW 2008 · ACM
Automatically refining the wikipedia infobox ontology
The combined efforts of human volunteers have recently extracted numerous facts from Wikipedia, storing them as machine-harvestable object-attribute-value triples in Wikipedia inf...
Fei Wu, Daniel S. Weld
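The entry above describes Wikipedia infobox facts stored as machine-harvestable object-attribute-value triples. A minimal illustration of that representation (the entity names and values below are made up for illustration, not the authors' actual schema or data):

```python
from collections import namedtuple

# One machine-harvestable fact: (object, attribute, value).
Triple = namedtuple("Triple", ["obj", "attr", "value"])

# Toy triples of the kind an infobox extractor might emit
# (illustrative values, not taken from Wikipedia).
facts = [
    Triple("Seattle", "country", "United States"),
    Triple("Seattle", "nickname", "Emerald City"),
]

# Index the triples by object so attributes can be looked up quickly.
index = {}
for t in facts:
    index.setdefault(t.obj, {})[t.attr] = t.value

print(index["Seattle"]["country"])  # United States
```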
CSL 2004 · Springer
HyperLex: lexical cartography for information retrieval
This article describes an algorithm called HyperLex that is capable of automatically determining word uses in a textbase without recourse to a dictionary. The algorithm makes use ...
Jean Véronis
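HyperLex's core idea is to build a co-occurrence graph over words appearing near the target and to treat highly connected "hub" nodes as anchors of distinct word uses, with no dictionary involved. A minimal sketch of that idea on a toy corpus (the contexts, target word, and greedy hub-selection rule here are illustrative assumptions, not the paper's full algorithm):

```python
from collections import Counter
from itertools import combinations

# Toy contexts containing the ambiguous word "bar" (illustrative data).
contexts = [
    ["bar", "beer", "drink", "pub"],
    ["bar", "pub", "beer", "music"],
    ["bar", "chart", "graph", "axis"],
    ["bar", "graph", "chart", "data"],
]

# Build a co-occurrence graph over context words (excluding the target).
edges = Counter()   # (a, b) with a < b -> co-occurrence count
degree = Counter()  # word -> number of contexts it appears in
for ctx in contexts:
    words = sorted(set(ctx) - {"bar"})
    for a, b in combinations(words, 2):
        edges[(a, b)] += 1
    for w in words:
        degree[w] += 1

def adjacent(a, b):
    return edges[(min(a, b), max(a, b))] > 0

# Greedily pick hubs: take words in decreasing degree order,
# skipping any word already adjacent to a chosen hub.
hubs = []
for w, _ in degree.most_common():
    if not any(adjacent(w, h) for h in hubs):
        hubs.append(w)

print(hubs)  # ['beer', 'chart'] — one hub per induced use of "bar"
```

Each hub then stands for one use of the target word, and new contexts can be assigned to whichever hub they connect to most strongly.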
CIKM 2008 · Springer
Learning to link with wikipedia
This paper describes how to automatically cross-reference documents with Wikipedia: the largest knowledge base ever known. It explains how machine learning can be used to identify...
David N. Milne, Ian H. Witten
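One signal used in Wikipedia cross-referencing systems of this kind is the "commonness" of a sense: the prior probability that a given anchor text links to a given article, estimated from Wikipedia's own link counts. A minimal sketch with made-up counts (the full Milne & Witten system also combines this with semantic relatedness via machine learning, which is omitted here):

```python
# Toy anchor-text statistics: how often each surface string links to
# each Wikipedia article (counts are illustrative, not real data).
anchor_counts = {
    "tree": {"Tree": 70, "Tree (data structure)": 25, "Tree (graph theory)": 5},
}

def commonness(anchor, sense):
    """P(sense | anchor): fraction of links with this anchor text
    that point at the given article."""
    counts = anchor_counts[anchor]
    return counts[sense] / sum(counts.values())

def most_common_sense(anchor):
    # Baseline disambiguation: pick the sense with the highest prior.
    return max(anchor_counts[anchor], key=lambda s: commonness(anchor, s))

print(most_common_sense("tree"))  # Tree
```

Commonness alone is a surprisingly strong baseline; context-sensitive features matter mainly for anchors whose link distribution is less skewed than this toy example.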
ECAI 2000 · Springer
Enriching very large ontologies using the WWW
This paper explores the possibility of exploiting text on the World Wide Web to enrich the concepts in existing ontologies. First, a method to retrieve documents from the WWW...
Eneko Agirre, Olatz Ansa, Eduard H. Hovy, David Ma...
ACL 2010
Combining Orthogonal Monolingual and Multilingual Sources of Evidence for All Words WSD
Word Sense Disambiguation remains one of the most complex problems facing computational linguists to date. In this paper we present a system that combines evidence from a monoling...
Weiwei Guo, Mona Diab