We construct a framework that allows an algorithm to turn the distributions produced by some boosting algorithms into polynomially smooth distributions (w.r.t. the PAC oracle's distribution) with minimal performance loss. Further, we explore the case of Freund and Schapire's AdaBoost algorithm, bounding its distributions so that they are polynomially smooth. The main advantage of AdaBoost over other boosting techniques is that it is adaptive: it is able to take advantage of weak hypotheses that are more accurate than was assumed a priori. We show that this adaptiveness is preserved and improved by our technique. Our scheme allows AdaBoost to be executed in the on-line boosting mode (i.e., to perform boosting "by filtering"). Executed this way (and possessing the property of smoothness), AdaBoost can be efficiently applied to a wider range of learning problems than before. In particular, we demonstrate AdaBoost's application to the task of DNF learning.
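As an informal sketch (not the paper's formal statement), the notion of smoothness referred to above can be written as follows: a boosting distribution D' is κ-smooth with respect to the oracle distribution D if it never overweights any example by more than a factor of κ, and "polynomially smooth" means κ is bounded by a polynomial in the relevant learning parameters (the symbols ε and γ below, denoting the accuracy parameter and the weak learner's advantage, are illustrative assumptions):

```latex
% Illustrative (hedged) definition of smoothness, as commonly used in the
% smooth-boosting literature; parameter names are assumptions, not the
% paper's notation.
\[
  D' \text{ is } \kappa\text{-smooth w.r.t. } D
  \;\iff\;
  \forall x:\; D'(x) \le \kappa \cdot D(x),
  \qquad
  \text{polynomially smooth: } \kappa \le \mathrm{poly}\!\left(\tfrac{1}{\varepsilon}, \tfrac{1}{\gamma}\right).
\]
```

Under such a bound, the booster's distribution can be simulated from the example oracle (e.g., by rejection sampling with acceptance probability proportional to D'(x)/(\kappa D(x))), which is what makes the "boosting by filtering" mode feasible.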
Nader H. Bshouty, Dmitry Gavinsky