We study the learnability of sets in $\mathbb{R}^n$ under the Gaussian distribution, taking Gaussian surface area as the "complexity measure" of the sets being learned. Let $\mathcal{C}_S$ denote the class of all (measurable) sets with Gaussian surface area at most $S$. We first show that the class $\mathcal{C}_S$ is learnable to any constant accuracy in time $n^{O(S^2)}$, even in the arbitrary-noise ("agnostic") model. Complementing this, we also show that any learning algorithm for $\mathcal{C}_S$ information-theoretically requires $2^{\Omega(S^2)}$ examples for learning to constant accuracy. Together, these results show that Gaussian surface area essentially characterizes the computational complexity of learning under the Gaussian distribution.

Our approach yields several new learning results, including the following (all bounds are for learning to any constant accuracy):

• The class of all convex sets can be agnostically learned in time $2^{\tilde{O}(\sqrt{n})}$ (and we prove a $2^{\Omega(\sqrt{n})}$ lower bound for noise-free learning). This is the first subexponential-time algorithm for learning general convex sets ...
Adam R. Klivans, Ryan O'Donnell, Rocco A. Servedio
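For reference, here is a minimal sketch of the standard definition of Gaussian surface area (via Gaussian Minkowski content), the complexity measure used above; the notation $\gamma_n$ and $A_\delta$ is introduced here for illustration and is not taken from the excerpt:

$$\Gamma(A) \;=\; \liminf_{\delta \to 0^{+}} \frac{\gamma_n(A_\delta \setminus A)}{\delta}, \qquad A_\delta = \{\, x \in \mathbb{R}^n : \operatorname{dist}(x, A) \le \delta \,\},$$

where $\gamma_n$ denotes the standard Gaussian measure on $\mathbb{R}^n$, so that $\mathcal{C}_S = \{\, A \subseteq \mathbb{R}^n \text{ measurable} : \Gamma(A) \le S \,\}$. For intuition, a halfspace $\{x : x_1 \le t\}$ has $\Gamma = e^{-t^2/2}/\sqrt{2\pi} \le 1/\sqrt{2\pi}$, while Ball's theorem bounds the Gaussian surface area of every convex set by $O(n^{1/4})$; substituting $S = O(n^{1/4})$ into the $n^{O(S^2)}$ running time recovers the $2^{\tilde{O}(\sqrt{n})}$ bound for convex sets quoted above.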