Sciweavers

45 search results - page 4 / 9
Search: Boosting in the Limit: Maximizing the Margin of Learned Ense...
IJCAI 2003
Constructing Diverse Classifier Ensembles using Artificial Training Examples
Ensemble methods like bagging and boosting that combine the decisions of multiple hypotheses are some of the strongest existing machine learning methods. The diversity of the memb...
Prem Melville, Raymond J. Mooney
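The abstract above describes ensembles such as bagging, which combine the decisions of multiple hypotheses trained on resampled data. As a minimal illustration only (not the DECORATE-style method of the paper), the sketch below bags simple 1-D decision stumps via bootstrap resampling and majority vote; all function names are hypothetical.

```python
import random

def train_stump(data):
    # data: list of (x, y) pairs with y in {0, 1}.
    # Pick the (threshold, polarity) minimizing training error.
    best = (None, None, len(data) + 1)
    for t, _ in data:
        for pos in (0, 1):
            err = sum(1 for x, y in data
                      if (pos if x >= t else 1 - pos) != y)
            if err < best[2]:
                best = (t, pos, err)
    return best[:2]

def predict_stump(stump, x):
    t, pos = stump
    return pos if x >= t else 1 - pos

def bagged_ensemble(data, n_members=15, seed=0):
    # Train each member on a bootstrap resample (sampling with
    # replacement) so the members differ from one another.
    rng = random.Random(seed)
    members = []
    for _ in range(n_members):
        sample = [rng.choice(data) for _ in data]
        members.append(train_stump(sample))
    return members

def predict_ensemble(members, x):
    # Unweighted majority vote over the members' predictions.
    votes = sum(predict_stump(m, x) for m in members)
    return 1 if votes * 2 >= len(members) else 0
```

Resampling is one source of member diversity; the paper's point is that diversity can also be induced deliberately, e.g. with artificial training examples.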
ICDM 2009, IEEE
Active Learning with Adaptive Heterogeneous Ensembles
One common approach to active learning is to iteratively train a single classifier by choosing data points based on its uncertainty, but it is nontrivial to design uncertainty ...
Zhenyu Lu, Xindong Wu, Josh Bongard
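The uncertainty-driven querying the abstract refers to can be sketched in a toy 1-D setting: repeatedly fit a model to the labeled points, then query the pool point the model is least certain about (here, the point closest to the current decision threshold). This is a generic uncertainty-sampling loop, not the heterogeneous-ensemble method of the paper, and the function names are hypothetical.

```python
def fit_threshold(labeled):
    # Toy 1-D model: place the threshold halfway between the largest
    # labeled negative and the smallest labeled positive (assumes the
    # classes are threshold-separable).
    lo = max((x for x, y in labeled if y == 0), default=0.0)
    hi = min((x for x, y in labeled if y == 1), default=1.0)
    return (lo + hi) / 2

def active_learn(pool, oracle, n_queries=4):
    # Seed with the two extreme points, then spend the query budget on
    # the most uncertain remaining point each round.
    pool = sorted(pool)
    labeled = [(pool[0], oracle(pool[0])), (pool[-1], oracle(pool[-1]))]
    unlabeled = pool[1:-1]
    for _ in range(n_queries):
        t = fit_threshold(labeled)
        # Uncertainty = closeness to the current decision threshold.
        x = min(unlabeled, key=lambda v: abs(v - t))
        unlabeled.remove(x)
        labeled.append((x, oracle(x)))
    return fit_threshold(labeled)
```

With a single classifier the uncertainty measure is just distance to its boundary; the difficulty the abstract points at is defining a comparable measure when the ensemble mixes heterogeneous model types.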
COLT 2001, Springer
Limitations of Learning via Embeddings in Euclidean Half-Spaces
The notion of embedding a class of dichotomies in a class of linear half-spaces is central to the support vector machines paradigm. We examine the question of determining the mini...
Shai Ben-David, Nadav Eiron, Hans-Ulrich Simon
COLT 2008, Springer
On the Equivalence of Weak Learnability and Linear Separability: New Relaxations and Efficient Boosting Algorithms
Boosting algorithms build highly accurate prediction mechanisms from a collection of low-accuracy predictors. To do so, they employ the notion of weak-learnability. The starting po...
Shai Shalev-Shwartz, Yoram Singer
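The weak-learnability notion the abstract starts from is the one classic boosting operationalizes: each round, fit a predictor that is slightly better than chance on a reweighted sample, then upweight the points it got wrong. As a reference point only, here is a minimal AdaBoost sketch over 1-D decision stumps, not the relaxed or more efficient algorithms the paper develops; the function names are hypothetical.

```python
import math

def weighted_stump(data, w):
    # Best (threshold, polarity) stump under weights w.
    # data: list of (x, y) pairs with y in {-1, +1}.
    best = None
    for t, _ in data:
        for pol in (1, -1):
            err = sum(wi for (x, y), wi in zip(data, w)
                      if pol * (1 if x >= t else -1) != y)
            if best is None or err < best[2]:
                best = (t, pol, err)
    return best

def adaboost(data, rounds=10):
    n = len(data)
    w = [1.0 / n] * n
    ensemble = []  # list of (alpha, threshold, polarity)
    for _ in range(rounds):
        t, pol, err = weighted_stump(data, w)
        err = max(err, 1e-10)  # guard the log when the stump is perfect
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, pol))
        # Reweight: misclassified points gain weight, correct ones lose it.
        w = [wi * math.exp(-alpha * y * pol * (1 if x >= t else -1))
             for (x, y), wi in zip(data, w)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    # Sign of the alpha-weighted vote: a linear combination of the
    # weak predictors, which is where linear separability enters.
    score = sum(a * pol * (1 if x >= t else -1) for a, t, pol in ensemble)
    return 1 if score >= 0 else -1
```

The final predictor is a linear combination of weak hypotheses, which is the bridge the paper formalizes: weak learnability of the data is equivalent to linear separability in the space of weak-hypothesis outputs.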
MCS 2002, Springer
Boosting and Classification of Electronic Nose Data
Boosting methods are known to improve the generalization performance of learning algorithms by reducing both bias and variance, or by enlarging the margin of the resulting multi-cl...
Francesco Masulli, Matteo Pardo, Giorgio Sbervegli...