For software programs to gain full benefit from semi-structured web sources, wrapper programs must be built that provide a “machine-readable” view over them. A signific...
Given the large heterogeneity of the World Wide Web, using metadata on the search-engine side seems to be a useful approach for information retrieval. However, because a manual quali...
Camille Prime-Claverie, Michel Beigbeder, Thierry ...
Collaborative filtering techniques are widely used by many E-commerce sites for recommendation purposes. Such techniques help customers by suggesting products to purchase using o...
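The collaborative-filtering idea above can be sketched with a minimal user-based recommender; the ratings matrix, user names, and neighborhood size k are illustrative assumptions, not data from the paper:

```python
import math

def cosine(a, b):
    """Cosine similarity between two users over the items both rated."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    dot = sum(a[i] * b[i] for i in common)
    na = math.sqrt(sum(a[i] ** 2 for i in common))
    nb = math.sqrt(sum(b[i] ** 2 for i in common))
    return dot / (na * nb)

def recommend(ratings, user, k=2):
    """Score items the user has not rated, weighted by the ratings
    of the k most similar other users."""
    others = sorted(((cosine(ratings[user], ratings[u]), u)
                     for u in ratings if u != user), reverse=True)
    scores = {}
    for sim, u in others[:k]:
        for item, r in ratings[u].items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical purchase/rating data for three customers.
ratings = {
    "alice": {"book": 5, "film": 3},
    "bob":   {"book": 4, "film": 3, "game": 5},
    "carol": {"book": 1, "game": 2},
}
print(recommend(ratings, "alice"))  # suggests only items alice has not rated
```

Real E-commerce deployments typically use item-based variants and sparse-matrix representations for scale, but the similarity-weighted aggregation step is the same.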
We profile a system for search and analysis of large-scale email archives. The system is built around four facets: a content-based search engine, a statistical topic model, automaticall...
Distributed Perception Networks (DPN) are a MAS approach to large scale fusion of heterogeneous and noisy information. DPN agents can establish meaningful information filtering c...
Web services have emerged as a new paradigm that supports loosely-coupled distributed systems in service discovery and service execution. Next generation web services will evolve ...
In recent years, many research systems have been proposed to perform data extraction and automation tasks on Web sources. Since most of today’s Web sources are “human-readable...
Several algorithms based on link analysis have been developed to measure the importance of nodes in a graph, such as pages on the World Wide Web. PageRank and HITS are the most pop...
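The PageRank computation mentioned above can be sketched as a simple power iteration; the toy graph, damping factor, and iteration count are assumptions for illustration:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each node to the list of nodes it links to.
    Returns a dict of PageRank scores summing to 1."""
    nodes = list(links)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        # Teleportation term: every node receives (1 - d) / n.
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, outlinks in links.items():
            if outlinks:
                share = damping * rank[node] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # Dangling node: spread its rank uniformly over all nodes.
                for target in nodes:
                    new_rank[target] += damping * rank[node] / n
        rank = new_rank
    return rank

# Tiny hypothetical web graph: A links to B and C, B to C, C back to A.
ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
```

HITS differs in computing two mutually reinforcing scores per node (hubs and authorities) rather than the single stationary-distribution score shown here.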
In open environments like the Web, and in open Multiagent and Peer2Peer systems, consent among the autonomous, self-interested knowledge sources and users very often cannot be establ...
Due to the enormous size of the web and the low precision of user queries, finding the right information on the web can be difficult, if not impossible. One approach that tries to ...