FOCS 2005, IEEE

On Learning Mixtures of Heavy-Tailed Distributions

We consider the problem of learning mixtures of arbitrary symmetric distributions. We formulate sufficient separation conditions and present a learning algorithm with provable guarantees for mixtures of distributions satisfying these conditions. Our bounds are independent of the variances of the distributions; to the best of our knowledge, no previous algorithms with provable learning guarantees were known for distributions with infinite variance and/or expectation. For Gaussian and log-concave distributions, our results match the best known sufficient separation conditions [1, 15]. Our algorithm requires a sample of size Õ(dk), where d is the number of dimensions and k is the number of distributions in the mixture. We also show that for isotropic power-law, exponential, and Gaussian distributions, our separation condition is optimal up to a constant factor.
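The abstract's point that the bounds must not depend on variance can be seen in a small numerical sketch. The snippet below (not the paper's algorithm; all parameters such as the separation `sep = 20.0` are illustrative choices) draws samples from a mixture of two d-dimensional heavy-tailed distributions with standard Cauchy coordinates, a distribution with no finite mean or variance, and compares the sample mean against the coordinate-wise median as a center estimate:

```python
import numpy as np

# Illustration only (not the paper's algorithm): each sample is one of two
# well-separated centers plus per-coordinate standard Cauchy noise. The
# Cauchy law has no finite expectation or variance, so the sample mean is
# an unreliable center estimate, while the coordinate-wise median (a
# variance-independent statistic) still concentrates.
rng = np.random.default_rng(0)
d = 50       # number of dimensions
n = 2000     # total number of samples
sep = 20.0   # separation between the two centers (arbitrary choice)

centers = np.zeros((2, d))
centers[1, 0] = sep

labels = rng.integers(0, 2, size=n)
samples = centers[labels] + rng.standard_cauchy(size=(n, d))

# Using the (here, known) labels, compare mean vs. median estimates of
# the first center.
cluster0 = samples[labels == 0]
err_mean = np.linalg.norm(cluster0.mean(axis=0) - centers[0])
err_median = np.linalg.norm(np.median(cluster0, axis=0) - centers[0])
print(f"mean error:   {err_mean:.2f}")
print(f"median error: {err_median:.2f}")
```

Rerunning with different seeds shows the mean error fluctuating wildly (a single extreme Cauchy draw can dominate it), while the median error stays small, which is the kind of behavior a variance-independent guarantee has to accommodate.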
Added 24 Jun 2010
Updated 24 Jun 2010
Type Conference
Year 2005
Where FOCS
Authors Anirban Dasgupta, John E. Hopcroft, Jon M. Kleinberg, Mark Sandler