APPROX 2007, Springer

Worst-Case to Average-Case Reductions Revisited

Abstract. A fundamental goal of computational complexity (and foundations of cryptography) is to find a polynomial-time samplable distribution (e.g., the uniform distribution) and a language in NTIME(f(n)) for some polynomial function f, such that the language is hard on the average with respect to this distribution, given that NP is worst-case hard (i.e., NP ≠ P, or NP ⊄ BPP). Currently, no such result is known even if we relax the language to lie in nondeterministic sub-exponential time. There has been a long line of research trying to explain our failure to prove such worst-case/average-case connections [FF93, Vio03, BT03, AGGM06]. The bottom line of this research is essentially that (under plausible assumptions) non-adaptive Turing reductions cannot prove such results. In this paper we revisit the problem. Our first observation is that the above-mentioned negative arguments extend to a non-standard notion of average-case complexity, in which the distribution on the inputs with res...
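Stated a bit more formally, the worst-case/average-case connection the abstract asks for can be sketched as the implication below; the symbols L, D, A and the inverse-polynomial success bound are illustrative notation chosen here, not definitions taken from the paper:

% Hedged sketch of the desired worst-case-to-average-case connection.
% L, D, A and the 1 - n^{-c} threshold are illustrative, not the paper's own notation.
\[
  \mathrm{NP} \not\subseteq \mathrm{BPP}
  \;\Longrightarrow\;
  \exists\, L \in \mathrm{NTIME}(f(n)) \text{ and a poly-time samplable } D
  \text{ such that } (L, D) \text{ is hard on average},
\]
where "(L, D) is hard on average" means, roughly, that for every probabilistic polynomial-time algorithm A and every constant c,
\[
  \Pr_{x \sim D_n}\big[A(x) = L(x)\big] < 1 - n^{-c} \quad \text{for infinitely many } n.
\]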
Added: 07 Jun 2010
Updated: 07 Jun 2010
Type: Conference
Year: 2007
Where: APPROX
Authors: Dan Gutfreund, Amnon Ta-Shma