One of the fundamental limits to high-performance, high-reliability file systems is memory's vulnerability to system crashes. Because memory is viewed as unsafe, systems per...
Peter M. Chen, Wee Teck Ng, Subhachandra Chandra, ...
Most Internet services rely on the traditional client-server model, where the quality of services usually depends on the performance of those servers. In this paper, we propose a ...
Chei-Yol Kim, Sung-Hoon Sohn, Baik-Song Ahn, Gyu-I...
This paper presents a new carrier sensing mechanism called DVCS (Directional Virtual Carrier Sensing) for wireless communication using directional antennas. DVCS does not require ...
Mineo Takai, Jay Martin, Rajive Bagrodia, Aifeng R...
Hadoop is a reference software framework supporting the Map/Reduce programming model. It relies on the Hadoop Distributed File System (HDFS) as its primary storage system. Althoug...
The ideal distributed file system would provide all its users with coherent, shared access to the same set of files, yet would be arbitrarily scalable to provide more storage space and hi...
Chandramohan A. Thekkath, Timothy Mann, Edward K. ...