COLT 2010, Springer
The Convergence Rate of AdaBoost
Download: www.cs.princeton.edu
Abstract. We pose the problem of determining the rate of convergence at which AdaBoost minimizes exponential loss. Boosting is the problem of combining many "weak," high-error hypotheses to generate a single "strong" …
Robert E. Schapire
Tags: AdaBoost | COLT 2010 | Convergence | Exponential Loss | Machine Learning
Post Info
More Details: n/a
Added: 10 Feb 2011
Updated: 10 Feb 2011
Type: Journal
Year: 2010
Where: COLT
Authors: Robert E. Schapire
Comments: 0
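The abstract above concerns how fast AdaBoost drives down the exponential loss (1/m) Σ_i exp(−y_i F(x_i)) of its combined hypothesis F. The snippet below is a minimal sketch of standard AdaBoost that records this loss after each round, purely to illustrate the quantity whose convergence rate the paper asks about; the `weak_learner` callback, its interface, and all names are illustrative assumptions, not anything taken from the paper.

```python
import numpy as np

def adaboost_exp_loss(X, y, weak_learner, n_rounds=100):
    """Minimal AdaBoost sketch with labels y in {-1, +1}.

    `weak_learner(X, y, D)` is an assumed caller-supplied function that
    returns a vector of {-1, +1} predictions on X, fit under the current
    example-weight distribution D.  Returns the exponential loss per round.
    """
    n = len(y)
    D = np.full(n, 1.0 / n)                  # weight distribution over examples
    margin = np.zeros(n)                     # running margins y_i * F_t(x_i)
    losses = []
    for _ in range(n_rounds):
        h = weak_learner(X, y, D)            # weak hypothesis predictions on X
        eps = D[h != y].sum()                # weighted error under D
        eps = np.clip(eps, 1e-12, 1 - 1e-12) # guard against error exactly 0 or 1
        alpha = 0.5 * np.log((1 - eps) / eps)
        margin += alpha * y * h              # add alpha_t * h_t to the combined F
        losses.append(np.exp(-margin).mean())  # exponential loss after this round
        D = np.exp(-margin)                  # AdaBoost reweighting ...
        D /= D.sum()                         # ... renormalized to a distribution
    return losses
```

With the usual choice alpha_t = 0.5 * ln((1 − ε_t)/ε_t), each round multiplies the training exponential loss by 2√(ε_t(1 − ε_t)) ≤ 1; how quickly this product approaches the infimum of the exponential loss is the convergence-rate question the paper poses.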