Sciweavers

Boosting with Diverse Base Classifiers
DIS 2010, Springer
Speeding Up and Boosting Diverse Density Learning
Abstract. In multi-instance learning, each example is described by a bag of instances instead of a single feature vector. In this paper, we revisit the idea of performing multi-ins...
James R. Foulds, Eibe Frank
ADMA 2006, Springer
An Effective Combination Based on Class-Wise Expertise of Diverse Classifiers for Predictive Toxicology Data Mining
This paper presents a study on the combination of different classifiers for toxicity prediction. Two combination operators for the Multiple-Classifier System definition are also pr...
Daniel Neagu, Gongde Guo, Shanshan Wang
PAKDD 2000, ACM
Adaptive Boosting for Spatial Functions with Unstable Driving Attributes
Combining multiple global models (e.g. back-propagation based neural networks) is an effective technique for improving classification accuracy by reducing variance through manipu...
Aleksandar Lazarevic, Tim Fiez, Zoran Obradovic
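The entry above describes boosting as a way to reduce variance by combining models trained on re-weighted data. A minimal pure-Python sketch of that idea, using AdaBoost with decision stumps on a toy 1-D dataset (the dataset, function names, and round count are illustrative assumptions, not taken from the paper):

```python
import math

# Toy 1-D dataset, separable by a threshold near 0.5.
X = [0.1, 0.2, 0.3, 0.45, 0.6, 0.7, 0.8, 0.9]
y = [-1, -1, -1, 1, 1, 1, 1, 1]

def best_stump(X, y, w):
    """Return (weighted error, threshold, polarity) of the best decision stump."""
    best = None
    for thr in X:
        for pol in (1, -1):
            pred = [pol if x >= thr else -pol for x in X]
            err = sum(wi for wi, p, yi in zip(w, pred, y) if p != yi)
            if best is None or err < best[0]:
                best = (err, thr, pol)
    return best

def adaboost(X, y, rounds=5):
    """Fit an ensemble of weighted stumps by re-weighting the training data."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        err, thr, pol = best_stump(X, y, w)
        err = max(err, 1e-10)                       # avoid log(1/0)
        alpha = 0.5 * math.log((1 - err) / err)     # stump's vote weight
        ensemble.append((alpha, thr, pol))
        # Increase the weight of misclassified examples, decrease the rest.
        w = [wi * math.exp(-alpha * yi * (pol if x >= thr else -pol))
             for wi, x, yi in zip(w, X, y)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    """Classify x by the sign of the weighted sum of stump votes."""
    score = sum(a * (pol if x >= thr else -pol) for a, thr, pol in ensemble)
    return 1 if score >= 0 else -1
```

The re-weighting step is where variance reduction comes from: each round's stump is forced to focus on the examples the previous stumps got wrong.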
CIARP 2007, Springer
Bagging with Asymmetric Costs for Misclassified and Correctly Classified Examples
Abstract. Diversity is a key characteristic to obtain advantages of combining predictors. In this paper, we propose a modification of bagging to explicitly trade off diversity and ...
Ricardo Ñanculef, Carlos Valle, Héct...
SSPR 2000, Springer
The Role of Combining Rules in Bagging and Boosting
To improve weak classifiers, bagging and boosting can be used. These techniques are based on combining classifiers. Usually, a simple majority vote or a weighted majority vote are...
Marina Skurichina, Robert P. W. Duin
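The abstract above mentions the two combining rules most often used in bagging and boosting: simple majority vote and weighted majority vote. A minimal sketch of both rules (the labels and weights are made-up examples, not data from the paper):

```python
from collections import Counter

def majority_vote(predictions):
    """Combine class labels from several classifiers by simple majority."""
    return Counter(predictions).most_common(1)[0][0]

def weighted_majority_vote(predictions, weights):
    """Combine labels, weighting each classifier's vote (e.g. by its accuracy)."""
    scores = {}
    for label, w in zip(predictions, weights):
        scores[label] = scores.get(label, 0.0) + w
    return max(scores, key=scores.get)

# Three classifiers vote on one example:
print(majority_vote(["spam", "ham", "spam"]))                            # spam
print(weighted_majority_vote(["spam", "ham", "ham"], [0.9, 0.4, 0.3]))   # spam
```

The weighted variant shows why combining rules matter: a single accurate classifier (weight 0.9) can outvote two weaker ones that agree with each other (0.4 + 0.3 = 0.7).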