AdaBoost is a well-known, effective technique for increasing the accuracy of learning algorithms. However, it has the potential to overfit the training set because its objective i...
We present MBoost, a novel extension of AdaBoost that explicitly uses multiple weak learners in the boosting process and provides robustness to learning models that overfit or are po...
In recent years, stability has been explored as a way to study the performance of learning algorithms, and it has been shown that stability is sufficient for generalization and is sufficient ...
There are two main approaches to the problem of gender classification: Support Vector Machines (SVMs) and AdaBoost learning methods. Of the two, SVMs achieve a higher correct rate but ar...
We discuss two learning algorithms for text filtering: modified Rocchio and a boosting algorithm called AdaBoost. We show how both algorithms can be adapted to maximize any gene...
We propose to use AdaBoost to efficiently learn classifiers over very large and possibly distributed data sets that cannot fit into main memory, as well as on-line learning wher...
We consider the AdaBoost procedure for boosting weak learners. In AdaBoost, a key step is choosing a new distribution on the training examples based on the old distribution and th...
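As context for the step this abstract describes: the standard AdaBoost update sets D_{t+1}(i) proportional to D_t(i) * exp(-alpha_t * y_i * h_t(x_i)), where alpha_t = 0.5 * ln((1 - eps_t) / eps_t) and eps_t is the weak learner's weighted error. Below is a minimal Python sketch of this reweighting step; the function and variable names are illustrative, not taken from the paper.

    import math

    def update_distribution(dist, predictions, labels):
        """One AdaBoost reweighting step; predictions and labels are in {-1, +1}."""
        # Weighted error of the weak learner under the current distribution.
        eps = sum(d for d, h, y in zip(dist, predictions, labels) if h != y)
        eps = min(max(eps, 1e-10), 1.0 - 1e-10)  # guard against division by zero
        # Weak learner weight: larger when its weighted error is smaller.
        alpha = 0.5 * math.log((1.0 - eps) / eps)
        # Up-weight misclassified examples, down-weight correctly classified ones.
        new_dist = [d * math.exp(-alpha * y * h)
                    for d, h, y in zip(dist, predictions, labels)]
        z = sum(new_dist)  # normalizer so the weights form a distribution again
        return [d / z for d in new_dist], alpha

    # Example: the fourth example is misclassified, so its weight grows
    # from 0.25 to 0.5 while the other three shrink to 1/6 each.
    dist, alpha = update_distribution([0.25] * 4, [1, -1, 1, 1], [1, -1, 1, -1])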
Though AdaBoost has been widely used for feature selection and classifier learning, many of the selected features, or weak classifiers, are redundant. By incorporating mutual infor...
LinLin Shen, Li Bai, Daniel Bardsley, Yangsheng Wa...
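The abstract above is cut off, but the visible idea (pruning weak classifiers whose outputs are statistically redundant with those already selected) can be illustrated with an empirical mutual-information filter. This is a sketch of the general technique, not the paper's actual criterion; the threshold and all names are assumptions.

    import math
    from collections import Counter

    def mutual_information(a, b):
        """Empirical mutual information (in nats) between two output sequences."""
        n = len(a)
        joint = Counter(zip(a, b))
        pa, pb = Counter(a), Counter(b)
        return sum((c / n) * math.log((c / n) / ((pa[x] / n) * (pb[y] / n)))
                   for (x, y), c in joint.items())

    def is_redundant(candidate_outputs, selected_outputs, threshold=0.5):
        """Flag a candidate weak classifier whose outputs share too much
        information with any already-selected classifier (threshold assumed)."""
        return any(mutual_information(candidate_outputs, s) > threshold
                   for s in selected_outputs)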
A new learning strategy for object detection is presented. The proposed scheme forgoes the need to train a collection of detectors dedicated to homogeneous families of poses, an...
Karim Ali, Francois Fleuret, David Hasler and Pasc...
Although AdaBoost has achieved great success, it still suffers from the following problems: (1) the training process could be unmanageable when the number of features is extremely ...
Hongbo Deng, Jianke Zhu, Michael R. Lyu, Irwin Kin...