Sciweavers

86 search results - page 5 / 18
» Bagging, Boosting, and C4.5
IJCAI 2003
Constructing Diverse Classifier Ensembles using Artificial Training Examples
Ensemble methods such as bagging and boosting, which combine the decisions of multiple hypotheses, are among the strongest existing machine learning methods. The diversity of the memb...
Prem Melville, Raymond J. Mooney
ICASSP 2008 (IEEE)
A weighted subspace approach for improving bagging performance
Bagging is an ensemble method that uses random resampling of a dataset to construct models. In classification scenarios, the random resampling procedure in bagging induces some c...
Qu-Tang Cai, Chun-Yi Peng, Chang-Shui Zhang
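The bootstrap-resampling step this abstract describes can be sketched in a few lines. All names below are illustrative, and a trivial majority-label predictor stands in for a real base learner such as a decision tree:

```python
import random
from collections import Counter

def train_base_learner(sample):
    # Hypothetical base learner: always predicts the majority label seen
    # in its bootstrap sample; a real ensemble would use a richer model.
    majority = Counter(label for _, label in sample).most_common(1)[0][0]
    return lambda x: majority

def bag(data, n_models=11, seed=0):
    # Bagging: train each model on a bootstrap resample of the data
    # (len(data) draws with replacement), then combine by majority vote.
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        sample = [rng.choice(data) for _ in data]
        models.append(train_base_learner(sample))
    return models

def predict(models, x):
    # Aggregate the individual predictions by majority vote.
    votes = Counter(m(x) for m in models)
    return votes.most_common(1)[0][0]
```

Because each bootstrap sample omits roughly a third of the original examples, the trained models differ from one another, which is the source of the diversity that papers like this one try to control.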
NECO 2006
Experiments with AdaBoost.RT, an Improved Boosting Scheme for Regression
The application of boosting techniques to regression problems has received relatively little attention compared with the research aimed at classification problems. This paper ...
Durga L. Shrestha, Dimitri P. Solomatine
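A rough sketch of the thresholded-reweighting idea behind AdaBoost.RT: examples whose absolute relative error exceeds a threshold phi are treated as "misclassified" and keep their weight, while well-fit examples are demoted. This is not the authors' exact algorithm; the weighted linear base learner, phi, and the power n = 2 are illustrative assumptions:

```python
import numpy as np

def weighted_linear_fit(x, y, w):
    # Hypothetical base learner: weighted least-squares line y = a*x + b.
    A = np.column_stack([x, np.ones_like(x)])
    s = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(A * s[:, None], y * s, rcond=None)
    return lambda xs: np.column_stack([xs, np.ones_like(xs)]) @ coef

def adaboost_rt(x, y, n_rounds=5, phi=0.1):
    # Thresholded boosting for regression: an example counts as an error
    # when its absolute relative error exceeds phi.
    m = len(x)
    D = np.full(m, 1.0 / m)            # example weights
    models, betas = [], []
    for _ in range(n_rounds):
        f = weighted_linear_fit(x, y, D)
        are = np.abs(f(x) - y) / np.abs(y)   # absolute relative error
        eps = D[are > phi].sum()             # weighted error rate
        beta = max(eps, 1e-10) ** 2          # power n = 2 (illustrative)
        D = np.where(are <= phi, D * beta, D)  # demote well-fit examples
        D /= D.sum()
        models.append(f)
        betas.append(beta)
    logw = np.log(1.0 / np.array(betas))
    logw /= logw.sum()
    def predict(xs):
        # Final output: log(1/beta)-weighted average of the base models.
        return sum(w * f(xs) for w, f in zip(logw, models))
    return predict
```

Note the division by y in the relative-error step assumes targets are bounded away from zero; handling near-zero targets is one of the practical issues such schemes must address.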
ICDM 2003 (IEEE)
Comparing Pure Parallel Ensemble Creation Techniques Against Bagging
We experimentally evaluate randomization-based approaches to creating an ensemble of decision-tree classifiers. Unlike methods related to boosting, all of the eight approaches co...
Lawrence O. Hall, Kevin W. Bowyer, Robert E. Banfi...
CIARP 2007 (Springer)
Bagging with Asymmetric Costs for Misclassified and Correctly Classified Examples
Abstract. Diversity is a key characteristic for obtaining the advantages of combining predictors. In this paper, we propose a modification of bagging that explicitly trades off diversity and ...
Ricardo Ñanculef, Carlos Valle, Héct...