

Randomized computations on large data sets: tight lower bounds

We study the randomized version of a computation model (introduced in [9, 10]) that restricts random access to external memory and bounds internal memory space. Essentially, this model can be viewed as a powerful version of a data stream model: like other data stream models, it puts no cost on sequential scans of external memory, but in addition (like other external memory models, and unlike streaming models) it admits several large external memory devices that can be read and written in parallel. We obtain tight lower bounds for the decision problems set equality, multiset equality, and checksort. More precisely, we show that any randomized one-sided-error Monte Carlo algorithm for these problems must perform Ω(log N) random accesses to external memory devices, provided that the internal memory size is at most O((N/log N)^(1/4)), where N denotes the size of the input data. From the lower bound on the set equality problem we can infer lower bounds on the worst case data complexity of...
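For intuition only, here is a minimal sketch (not the paper's model or algorithm) of a one-sided-error Monte Carlo test for multiset equality based on polynomial fingerprinting; the prime modulus and trial count are illustrative assumptions, and the sketch assumes all input values are non-negative integers below the modulus.

```python
import random

def multiset_equal(a, b, trials=20):
    """One-sided-error Monte Carlo test for multiset equality.

    Compares the fingerprint polynomials prod(x - a_i) and prod(x - b_i)
    at random points modulo a fixed prime. If the multisets are equal the
    test always accepts; if they differ, each trial rejects with high
    probability (error at most len(a)/p per trial).
    """
    if len(a) != len(b):
        return False
    p = (1 << 61) - 1  # a large Mersenne prime modulus (illustrative choice)
    for _ in range(trials):
        x = random.randrange(p)
        fa = fb = 1
        for v in a:
            fa = fa * ((x - v) % p) % p
        for v in b:
            fb = fb * ((x - v) % p) % p
        if fa != fb:
            return False  # a witness that the multisets differ
    return True  # equal, or an (unlikely) false accept
```

The error is one-sided because equal multisets yield identical polynomials and are never rejected, while unequal multisets are accepted only if every random evaluation point happens to be a root of the nonzero difference polynomial.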
André Hernich, Martin Grohe, Nicole Schweikardt
Added 08 Dec 2009
Updated 08 Dec 2009
Type Conference
Year 2006
Where PODS
Authors André Hernich, Martin Grohe, Nicole Schweikardt