The problem of simultaneous feature extraction and selection for classifier design is considered. A new framework is proposed, based on boosting algorithms that can either 1) select existing features or 2) assemble new features as combinations of them. The framework is simple and mathematically sound, derived from the statistical view of boosting and Taylor series approximations in functional space. Unlike classical boosting, which is limited to linear feature combinations, the new algorithms support more sophisticated combinations of weak learners, such as “sums of products” or “products of sums”. This is shown to enable the fully automated design of fairly complex predictor structures with few weak learners, leading to faster and more accurate classifiers based on more informative features. Extensive experiments on synthetic data, UCI datasets, object detection, and scene recognition show that these predictors consistently lead to more accurate classifiers than ...
Mohammad J. Saberian, Nuno Vasconcelos
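The contrast between a classical linear combination of weak learners and the richer “sums of products” / “products of sums” structures can be sketched as follows. This is an illustrative toy only: the decision-stump learners, thresholds, weights, and data are invented for the example and do not reproduce the authors' boosting algorithms.

```python
import numpy as np

# Toy weak learner: a decision stump on feature j with threshold t.
def stump(j, t):
    return lambda X: np.where(X[:, j] > t, 1.0, -1.0)

# Tiny invented dataset (3 samples, 2 features).
X = np.array([[0.2, 1.5],
              [0.8, 0.3],
              [0.5, 0.9]])

h1, h2 = stump(0, 0.4), stump(1, 0.6)
h3, h4 = stump(0, 0.6), stump(1, 1.0)

# Classical boosting: a weighted linear sum of weak learners.
f_linear = lambda X: 0.7 * h1(X) + 0.3 * h2(X)

# "Sum of products": each term multiplies weak learners, so few
# learners already encode feature interactions.
f_sop = lambda X: 0.6 * h1(X) * h2(X) + 0.4 * h3(X) * h4(X)

# "Product of sums": the dual structure, multiplying linear stages.
f_pos = lambda X: (0.7 * h1(X) + 0.3 * h2(X)) * (0.6 * h3(X) + 0.4 * h4(X))

print(np.sign(f_linear(X)))
print(np.sign(f_sop(X)))
print(np.sign(f_pos(X)))
```

With the same four stumps, the three predictors realize different decision rules; the product-based structures capture interactions between features that the linear sum cannot.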