Bagging (Bootstrap Aggregating) has proved to be a useful, effective, and simple ensemble learning methodology. In generic bagging methods, all the classifiers that are trai...
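The entry above describes bagging only up to its truncation, so a minimal sketch of the generic procedure (bootstrap resampling plus unweighted majority vote) may help; scikit-learn decision trees are assumed as base classifiers, and the function names are invented for illustration, not taken from the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

def fit_bagging(X, y, n_estimators=25, seed=0):
    """Train each base tree on a bootstrap sample drawn with replacement."""
    rng = np.random.default_rng(seed)
    n = len(X)
    ensemble = []
    for _ in range(n_estimators):
        idx = rng.integers(0, n, size=n)          # bootstrap indices
        ensemble.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return ensemble

def predict_bagging(ensemble, X):
    """Aggregate the base predictions by unweighted majority vote."""
    votes = np.stack([clf.predict(X) for clf in ensemble])       # (m, n)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

X, y = make_classification(n_samples=300, random_state=0)
ensemble = fit_bagging(X, y)
print((predict_bagging(ensemble, X) == y).mean())   # training accuracy
```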
This paper addresses supervised learning in which the class membership of the training data is subject to uncertainty. The problem is tackled within the framework of the Dempster-Sha...
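The Dempster-Shafer framework mentioned here rests on combining mass functions with Dempster's rule of combination. Below is a toy illustration of that rule over an invented two-class frame; the mass values are made up and nothing here reproduces the paper's method.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: combine two mass functions keyed by frozenset focal sets."""
    out, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            out[inter] = out.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc                 # mass assigned to the empty set
    # renormalize by the non-conflicting mass (assumes conflict < 1)
    return {a: v / (1.0 - conflict) for a, v in out.items()}

A, B = frozenset({"A"}), frozenset({"B"})
theta = A | B                                   # the whole frame {A, B}
m1 = {A: 0.6, theta: 0.4}                       # evidence mildly supporting A
m2 = {B: 0.3, theta: 0.7}                       # weak evidence supporting B
print(combine(m1, m2))                          # combined masses sum to 1
```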
In classifier combining, one tries to fuse the information provided by a set of base classifiers. One of the difficulties in this process is how to deal with the variabilit...
Elzbieta Pekalska, Robert P. W. Duin, Marina Skuri...
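As one concrete instance of fusing the information given by a set of base classifiers, here is the fixed mean rule (averaging posterior estimates and taking the argmax); the base models and data are stand-ins, and this is only one of many combining rules studied in this literature, not necessarily the one analyzed above.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, random_state=1)
bases = [LogisticRegression(max_iter=1000), GaussianNB(),
         DecisionTreeClassifier(max_depth=3)]
for clf in bases:
    clf.fit(X, y)

# Mean rule: average the class-posterior matrices, then take the argmax.
posteriors = np.mean([clf.predict_proba(X) for clf in bases], axis=0)
fused = posteriors.argmax(axis=1)
print((fused == y).mean())                       # fused training accuracy
```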
Boosting is a set of methods for constructing classifier ensembles. The distinguishing feature of these methods is that they make it possible to obtain a strong classifier from the comb...
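Since several of these entries build on boosting, a compact version of the textbook discrete AdaBoost loop (decision stumps, labels mapped to {-1, +1}) may help fix ideas; it is the standard algorithm, not any single paper's variant.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, random_state=2)
y = 2 * y - 1                                   # relabel to {-1, +1}
n = len(X)
w = np.full(n, 1.0 / n)                         # start with uniform weights
stumps, alphas = [], []

for _ in range(50):
    stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
    pred = stump.predict(X)
    err = max(w[pred != y].sum(), 1e-12)        # weighted training error
    if err >= 0.5:                              # no better than chance: stop
        break
    alpha = 0.5 * np.log((1 - err) / err)       # weight of this stump
    w *= np.exp(-alpha * y * pred)              # up-weight the mistakes
    w /= w.sum()
    stumps.append(stump)
    alphas.append(alpha)

score = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
print((np.sign(score) == y).mean())             # ensemble training accuracy
```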
This paper presents a strategy to improve the AdaBoost algorithm with a quadratic combination of base classifiers. We observe that learning this combination is necessary to get be...
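One plausible reading of "quadratic combination" is a weighted sum over the base outputs h_i(x) together with all pairwise products h_i(x)h_j(x). The sketch below builds those features and fits the weights with plain logistic regression; that fitting choice is an assumption for illustration, not the paper's training scheme.

```python
import numpy as np
from itertools import combinations
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, random_state=3)
rng = np.random.default_rng(3)
bases = []
for _ in range(8):                              # diversify stumps via bootstrap
    idx = rng.integers(0, len(X), size=len(X))
    bases.append(DecisionTreeClassifier(max_depth=1).fit(X[idx], y[idx]))
H = np.column_stack([clf.predict(X) for clf in bases]) * 2 - 1   # (n, m) in {-1,+1}

def quadratic_features(H):
    """Linear base outputs h_i plus all pairwise products h_i * h_j."""
    pairs = [H[:, i] * H[:, j] for i, j in combinations(range(H.shape[1]), 2)]
    return np.column_stack([H] + pairs)

combiner = LogisticRegression(max_iter=1000).fit(quadratic_features(H), y)
print(combiner.score(quadratic_features(H), y))
```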
An ideal ensemble is composed of base classifiers that perform well and that have minimal overlap in their errors. Eliminating classifiers from an ensemble based on a criterion t...
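A hedged sketch of elimination-style pruning: greedily drop the member whose removal leaves the highest majority-vote accuracy on held-out labels. This criterion is illustrative, since the paper's own criterion is only partially quoted above.

```python
import numpy as np

def vote_accuracy(preds, y):
    """preds: (m, n) hard predictions; accuracy of their majority vote."""
    maj = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, preds)
    return (maj == y).mean()

def prune(preds, y, keep=5):
    """Backward elimination: repeatedly drop the member whose removal
    leaves the highest vote accuracy, until `keep` members remain."""
    idx = list(range(len(preds)))
    while len(idx) > keep:
        scores = [vote_accuracy(preds[[j for j in idx if j != i]], y)
                  for i in idx]
        idx.pop(int(np.argmax(scores)))         # drop the least useful member
    return idx

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)
preds = np.where(rng.random((12, 200)) < 0.7, y, 1 - y)   # 12 noisy voters
print(prune(preds, y, keep=5))                  # indices of the kept members
```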
Ensemble algorithms can improve the performance of a given learning algorithm by combining multiple base classifiers into an ensemble. In this paper, the idea of usin...
We propose a well-founded method of ranking a pool of m trained classifiers by their suitability for the current input of n instances. It can be used when dynamically selecting a s...
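One standard way to rank a pool for the current input is to score each classifier on the validation points nearest to the n incoming instances (a local-competence estimate). The sketch below assumes scikit-learn estimators with a `score` method; it is not necessarily the well-founded ranking the paper derives.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def rank_pool(pool, X_val, y_val, X_query, k=7):
    """Rank the m classifiers in `pool` by their accuracy on the
    validation points closest to the query batch `X_query`."""
    nn = NearestNeighbors(n_neighbors=k).fit(X_val)
    _, idx = nn.kneighbors(X_query)             # (n, k) neighbour indices
    region = np.unique(idx)                     # pooled region of competence
    scores = [clf.score(X_val[region], y_val[region]) for clf in pool]
    return np.argsort(scores)[::-1]             # classifier indices, best first
```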
In a multiple classifier system, dynamic selection (DS) has been used successfully to choose only the best subset of classifiers to recognize the test samples. Dos Santos et al...
Paulo Rodrigo Cavalin, Robert Sabourin, Ching Y. S...
Selecting a set of good and diverse base classifiers is essential for building multiple classifier systems. However, almost all commonly used procedures for selecting such base cla...
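A minimal sketch of one common selection heuristic: greedily grow the subset, at each step adding the classifier that maximizes a blend of individual accuracy and mean pairwise disagreement with the members already chosen. The 0.5 trade-off weight and the greedy scheme are illustrative choices, not the paper's procedure.

```python
import numpy as np

def disagreement(p, q):
    """Fraction of samples on which two prediction vectors differ."""
    return (p != q).mean()

def select(preds, y, size=5, alpha=0.5):
    """preds: (m, n) hard predictions; pick `size` accurate, diverse members."""
    acc = (preds == y).mean(axis=1)             # individual accuracies
    chosen = [int(acc.argmax())]                # seed with the single best
    while len(chosen) < size:
        rest = [i for i in range(len(preds)) if i not in chosen]
        div = [np.mean([disagreement(preds[i], preds[j]) for j in chosen])
               for i in rest]
        gain = alpha * acc[rest] + (1 - alpha) * np.array(div)
        chosen.append(rest[int(gain.argmax())])
    return chosen

rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=200)
preds = np.where(rng.random((15, 200)) < 0.7, y, 1 - y)   # 15 noisy voters
print(select(preds, y, size=5))                 # indices of the chosen members
```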