Sciweavers

86 search results - page 6 / 18
» Bagging, Boosting, and C4.5
ML 2000 (ACM)
Randomizing Outputs to Increase Prediction Accuracy
Bagging and boosting reduce error by changing both the inputs and outputs to form perturbed training sets, growing predictors on these perturbed training sets, and combining them. A que...
Leo Breiman
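
The snippet describes the shared recipe: perturb the training set, grow a predictor on each perturbed copy, and combine the predictors. Below is a minimal Python sketch of the output-randomization flavor the title points at, flipping a small fraction of the class labels before growing each tree; the flip rate, ensemble size, dataset, and use of scikit-learn decision trees are illustrative assumptions, not details taken from the paper.

# Minimal sketch of voting predictors grown on randomized outputs (label
# flipping). Illustrative only; not Breiman's exact procedure or code.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
classes = np.unique(y)

ensemble = []
for _ in range(25):
    y_noisy = y.copy()
    flip = rng.random(len(y)) < 0.10             # perturb ~10% of the outputs
    y_noisy[flip] = rng.choice(classes, size=int(flip.sum()))
    ensemble.append(DecisionTreeClassifier(random_state=0).fit(X, y_noisy))

# Combine the perturbed-output predictors by majority vote.
votes = np.stack([tree.predict(X) for tree in ensemble])
pred = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
print("training accuracy of the combined vote:", float((pred == y).mean()))
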
DAGM 2011 (Springer)
Multiple Instance Boosting for Face Recognition in Videos
For face recognition from video streams, cues such as transcripts, subtitles, or on-screen text are often available. This information could be very valuable for improving the recogni...
Paul Wohlhart, Martin Köstinger, Peter M. Rot...
MCS 2002 (Springer)
Distributed Pasting of Small Votes
Bagging and boosting are two popular ensemble methods that achieve better accuracy than a single classifier. These techniques have limitations on massive datasets, as the size of t...
Nitesh V. Chawla, Lawrence O. Hall, Kevin W. Bowye...
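
The idea behind pasting small votes is that each classifier only ever sees a tiny "bite" of the data, so the ensemble can scale to datasets too large for a single machine. The rough sketch below builds random bites inside disjoint partitions (stand-ins for the nodes of a cluster) and combines everything by voting; the bite size, counts, and dataset are assumptions for illustration, and this is not the authors' distributed implementation.

# Rough sketch of pasting small votes: many classifiers are built on small
# random "bites" of a large dataset and combined by majority vote, with
# disjoint partitions standing in for cluster nodes. Illustrative only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=20000, n_features=20, random_state=0)

bite_size = 400                       # each voter sees only a tiny subset
partitions = np.array_split(rng.permutation(len(X)), 4)

voters = []
for part in partitions:               # in a real deployment: one node per partition
    for _ in range(10):
        idx = rng.choice(part, size=bite_size, replace=False)
        voters.append(DecisionTreeClassifier().fit(X[idx], y[idx]))

votes = np.stack([clf.predict(X) for clf in voters])
pred = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
print("accuracy of the combined small votes:", float((pred == y).mean()))
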
ECML 2004 (Springer)
A Boosting Approach to Multiple Instance Learning
In this paper we present a boosting approach to multiple instance learning. As weak hypotheses we use balls (with respect to various metrics) centered at instances of positive bags...
Peter Auer, Ronald Ortner
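
The abstract names the weak hypotheses directly: balls centered at instances of positive bags, a bag being called positive if one of its instances falls inside the ball. The sketch below plugs that weak learner (Euclidean balls only) into a plain AdaBoost loop over bags; the synthetic bags, candidate radii, and number of rounds are assumptions for illustration, not the paper's setup.

# Sketch of boosting ball-shaped weak hypotheses over bags of instances.
import numpy as np

rng = np.random.default_rng(0)

def make_bag(positive):
    # A bag is a set of 2-D instances; positive bags contain at least one
    # instance near the hidden concept point (2, 2).
    inst = rng.normal(0.0, 1.0, size=(rng.integers(3, 8), 2))
    if positive:
        inst[0] = rng.normal(2.0, 0.3, size=2)
    return inst

bags = [make_bag(i % 2 == 0) for i in range(60)]
labels = np.array([1 if i % 2 == 0 else -1 for i in range(60)])

def ball_predict(center, radius, bag):
    # A bag is labeled positive if any of its instances lies inside the ball.
    return 1 if np.min(np.linalg.norm(bag - center, axis=1)) <= radius else -1

# Candidate balls: centered at instances of positive bags, a few radii each.
candidates = [(inst, r)
              for bag, lab in zip(bags, labels) if lab == 1
              for inst in bag
              for r in (0.5, 1.0, 2.0)]

weights = np.full(len(bags), 1.0 / len(bags))
ensemble = []
for _ in range(10):                            # a few boosting rounds
    best = None
    for center, radius in candidates:
        preds = np.array([ball_predict(center, radius, b) for b in bags])
        err = weights[preds != labels].sum()
        if best is None or err < best[0]:
            best = (err, center, radius, preds)
    err, center, radius, preds = best
    err = min(max(err, 1e-10), 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)      # standard AdaBoost weight update
    weights *= np.exp(-alpha * labels * preds)
    weights /= weights.sum()
    ensemble.append((alpha, center, radius))

score = sum(a * np.array([ball_predict(c, r, b) for b in bags])
            for a, c, r in ensemble)
print("training accuracy:", float(np.mean(np.sign(score) == labels)))
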
JMLR 2010
MOA: Massive Online Analysis
Massive Online Analysis (MOA) is a software environment for implementing algorithms and running experiments for online learning from evolving data streams. MOA includes a collecti...
Albert Bifet, Geoff Holmes, Richard Kirkby, Bernha...
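
MOA itself is a Java framework, so the sketch below does not use its API. It only illustrates the test-then-train (prequential) loop that online learning from a stream boils down to: each arriving example is first used to test the current model and then to update it. scikit-learn's SGDClassifier stands in as the online learner, and the dataset is an assumption for illustration.

# Prequential (test-then-train) evaluation of an online learner on a stream.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

X, y = make_classification(n_samples=10000, n_features=20, random_state=0)
classes = np.unique(y)

learner = SGDClassifier(random_state=0)
correct = seen = 0
for x_i, y_i in zip(X, y):
    x_i = x_i.reshape(1, -1)
    if seen > 0:                                # test on the example first...
        correct += int(learner.predict(x_i)[0] == y_i)
    learner.partial_fit(x_i, [y_i],             # ...then train on it
                        classes=classes if seen == 0 else None)
    seen += 1

print("prequential accuracy:", correct / (len(y) - 1))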