COLT 1999, Springer

Boosting as Entropy Projection

We consider the AdaBoost procedure for boosting weak learners. In AdaBoost, a key step is choosing a new distribution on the training examples based on the old distribution and the mistakes made by the present weak hypothesis. We show how AdaBoost’s choice of the new distribution can be seen as an approximate solution to the following problem: Find a new distribution that is closest to the old distribution subject to the constraint that the new distribution is orthogonal to the vector of mistakes of the current weak hypothesis. The distance (or divergence) between distributions is measured by the relative entropy. Alternatively, we could say that AdaBoost approximately projects the distribution vector onto a hyperplane defined by the mistake vector. We show that this new view of AdaBoost as an entropy projection is dual to the usual view of AdaBoost as minimizing the normalization factors of the updated distributions.
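The abstract's claim is easy to check numerically. Below is a minimal NumPy sketch (not from the paper; adaboost_update and the toy numbers are illustrative): with margins u_i = y_i h(x_i) in {-1, +1}, AdaBoost's exponential update d_i -> d_i exp(-alpha u_i) / Z with alpha = (1/2) ln((1 - eps)/eps) lands exactly on the hyperplane {d : d . u = 0}, which is what the relative-entropy projection demands.

import numpy as np

def adaboost_update(d, margins):
    """One AdaBoost reweighting step (hypothetical helper, not from the paper).

    d       : current distribution over the m training examples
    margins : u_i = y_i * h(x_i) in {-1, +1} for the current weak hypothesis h
    """
    eps = d[margins < 0].sum()              # weighted error of h under d
    alpha = 0.5 * np.log((1 - eps) / eps)   # AdaBoost's usual step size
    d_new = d * np.exp(-alpha * margins)    # exponential reweighting
    return d_new / d_new.sum()              # divide by the normalization Z

# Toy data: a fixed distribution and a fixed +/-1 mistake vector.
d = np.array([0.20, 0.15, 0.15, 0.10, 0.10, 0.10, 0.10, 0.10])
u = np.array([+1.0, +1.0, -1.0, +1.0, -1.0, +1.0, +1.0, -1.0])

d_new = adaboost_update(d, u)

# Projection constraint: the new distribution lies on the hyperplane
# orthogonal to u, i.e. the weak hypothesis has error exactly 1/2 under d_new.
print(np.dot(d_new, u))   # 0.0 up to floating-point error

The duality the abstract mentions is visible in this calculation: choosing alpha so that the constraint d_new . u = 0 holds is the same computation as choosing alpha to minimize the normalization factor Z, since dZ/d(alpha) = 0 is exactly the orthogonality condition.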
Jyrki Kivinen, Manfred K. Warmuth
Added: 03 Aug 2010
Updated: 03 Aug 2010
Type: Conference
Year: 1999
Where: COLT
Authors: Jyrki Kivinen, Manfred K. Warmuth