Abstract. We describe several improvements to Freund and Schapire's AdaBoost boosting algorithm, particularly in a setting in which hypotheses may assign confidences to each o...
Though AdaBoost has been widely used for feature selection and classifier learning, many of the selected features, or weak classifiers, are redundant. By incorporating mutual infor...
LinLin Shen, Li Bai, Daniel Bardsley, Yangsheng Wa...
We consider geometric conditions on a labeled data set which guarantee that boosting algorithms work well when linear classifiers are used as weak learners. We start by providing ...
In order to understand AdaBoost’s dynamics, especially its ability to maximize margins, we derive an associated simplified nonlinear iterated map and analyze its behavior in lo...
Cynthia Rudin, Ingrid Daubechies, Robert E. Schapi...
Motion estimation for applications where appearance undergoes complex changes is challenging due to the lack of an appropriate similarity function. In this paper, we propose to learn ...
Shaohua Kevin Zhou, Bogdan Georgescu, Dorin Comani...
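The results above all build on the same core procedure: AdaBoost maintains a distribution over training examples, fits a weak learner against that distribution each round, and reweights so that misclassified points receive more attention. As a minimal sketch of that loop (discrete AdaBoost with one-dimensional threshold stumps; the names `train_adaboost` and `best_stump` are illustrative, not from any of the papers listed):

```python
# Minimal sketch of discrete AdaBoost with threshold "stump" weak learners.
# Function and variable names here are illustrative only.
import math

def stump_predict(threshold, polarity, x):
    # A stump predicts +polarity above the threshold, -polarity below.
    return polarity if x >= threshold else -polarity

def best_stump(X, y, w):
    # Exhaustively search thresholds at the data points and pick the stump
    # with the lowest weighted error under the current distribution w.
    best = None
    for t in set(X):
        for pol in (1, -1):
            err = sum(wi for xi, yi, wi in zip(X, y, w)
                      if stump_predict(t, pol, xi) != yi)
            if best is None or err < best[0]:
                best = (err, t, pol)
    return best

def train_adaboost(X, y, rounds=10):
    n = len(X)
    w = [1.0 / n] * n          # uniform initial distribution
    ensemble = []
    for _ in range(rounds):
        err, t, pol = best_stump(X, y, w)
        err = min(max(err, 1e-10), 1 - 1e-10)   # guard against log(0)
        alpha = 0.5 * math.log((1 - err) / err)  # weak learner's vote weight
        ensemble.append((alpha, t, pol))
        # Reweight: misclassified points get exponentially more weight.
        w = [wi * math.exp(-alpha * yi * stump_predict(t, pol, xi))
             for xi, yi, wi in zip(X, y, w)]
        z = sum(w)
        w = [wi / z for wi in w]  # renormalize to a distribution
    return ensemble

def predict(ensemble, x):
    # Final hypothesis: sign of the alpha-weighted vote.
    score = sum(a * stump_predict(t, pol, x) for a, t, pol in ensemble)
    return 1 if score >= 0 else -1

# Toy data: positive class lies above 0.5.
X = [0.1, 0.2, 0.3, 0.6, 0.7, 0.9]
y = [-1, -1, -1, 1, 1, 1]
H = train_adaboost(X, y, rounds=5)
print([predict(H, x) for x in X])  # → [-1, -1, -1, 1, 1, 1]
```

The confidence-rated variant discussed in the first abstract lets each weak hypothesis output a real value rather than ±1; the same reweighting step applies with that real-valued output in place of `stump_predict`.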