Text analysis tools are now required to process increasingly large corpora, which are often organized as many small files (abstracts, news articles, etc.). Cloud computing offers a ...
Scientists working in eScience environments often use workflows to carry out their computations. Since these workflows evolve as the research itself evolves, they can be ...
Eran Chinthaka Withana, Beth Plale, Roger S. Barga...
Hadoop is a reference software framework supporting the Map/Reduce programming model. It relies on the Hadoop Distributed File System (HDFS) as its primary storage system. Althoug...
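To make the Map/Reduce programming model referenced above concrete, here is a minimal sketch of the canonical word-count job written against the Hadoop MapReduce API. It illustrates the model only and is not code from any of the papers listed here; the class names are the usual tutorial ones, and the input and output paths (passed as command-line arguments) are assumed to reside on HDFS.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: emit (word, 1) for every token in the input split.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: sum the counts gathered for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // local pre-aggregation
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));    // HDFS input dir
    FileOutputFormat.setOutputPath(job, new Path(args[1]));  // HDFS output dir
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Hadoop splits the input across mappers, shuffles the intermediate (word, count) pairs by key, and runs the reducer once per key; HDFS, mentioned above, holds both the input and the output.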
The recent emergence of clouds is making the vision of utility computing realizable, i.e., computing resources and services can be delivered, utilized, and paid for as utilities su...
Cloud computing has emerged as a new approach to large-scale computing and is attracting considerable attention from the scientific and research computing communities. Despite its gro...
Modern scientific experiments can generate hundreds of gigabytes to terabytes or even petabytes of data, which may furthermore be maintained in large numbers of relatively small fil...
Wantao Liu, Brian Tieman, Rajkumar Kettimuthu, Ian...
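The excerpt above points at a common pain point: moving large numbers of relatively small files is far less efficient than moving one large object, because per-file overheads dominate. As a generic illustration of the standard mitigation, the sketch below bundles a directory of small files into a single archive before transfer, using only the standard java.util.zip API. The Bundle class and pack method are hypothetical names for this example and do not come from the authors' transfer framework, which the excerpt does not describe.

import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class Bundle {

  // Pack every regular file under srcDir into a single zip archive, so that
  // a transfer tool moves one large object instead of many small files.
  public static void pack(Path srcDir, Path zipFile) throws IOException {
    try (OutputStream out = Files.newOutputStream(zipFile);
         ZipOutputStream zip = new ZipOutputStream(out);
         Stream<Path> files = Files.walk(srcDir)) {
      // Viewing the stream as an Iterable lets the loop propagate IOException.
      Iterable<Path> regular = files.filter(Files::isRegularFile)::iterator;
      for (Path p : regular) {
        // Store each file under its path relative to the source directory.
        String name = srcDir.relativize(p).toString().replace('\\', '/');
        zip.putNextEntry(new ZipEntry(name));
        Files.copy(p, zip);
        zip.closeEntry();
      }
    }
  }

  public static void main(String[] args) throws IOException {
    pack(Paths.get(args[0]), Paths.get(args[1]));
  }
}

On the receiving side the archive is unpacked once, amortizing per-file protocol and metadata costs over the whole transfer.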
Advances in internetworking technology and the decreasing cost-performance ratio of commodity computing components have enabled Volunteer Computing (VC). VC platforms aggregate te...
Bruno Donassolo, Henri Casanova, Arnaud Legrand, P...
Today, the BitTorrent Peer-to-Peer file-sharing network is one of the largest Internet applications: it generates massive traffic volumes, it is deployed in thousands of independe...
A significant open issue in cloud computing is performance. Few, if any, cloud providers or technologies offer quantitative performance guarantees. Regardless of the potential adv...
Zach Hill, Jie Li, Ming Mao, Arkaitz Ruiz-Alvarez,...
Volunteer Computing systems, or Desktop Grids (DGs), enable sharing of commodity computing resources across the globe and have gained tremendous popularity among scientific research...