COLT 2000, Springer
Leveraging for Regression

In this paper we examine master regression algorithms that leverage base regressors by iteratively calling them on modified samples. The most successful leveraging algorithm for classification is AdaBoost, an algorithm that requires only modest assumptions on the base learning method for its good theoretical bounds. We present three gradient descent leveraging algorithms for regression and prove AdaBoost-style bounds on their sample error using intuitive assumptions on the base learners. We derive bounds on the size of the master functions that lead to PAC-style bounds on the generalization error.
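The abstract describes master algorithms that leverage a base regressor by repeatedly calling it on modified samples and combining the results by gradient descent. As an illustration of that general scheme (not the authors' specific algorithms or their theoretical setting), here is a minimal sketch for squared loss: the master maintains a running prediction, hands the base learner the current residuals each round, and adds a damped copy of the returned hypothesis. Regression stumps are used as a hypothetical base learner; the function names and step size are assumptions for the example.

```python
import numpy as np

def fit_stump(X, r):
    """Base regressor: least-squares regression stump fit to residuals r."""
    best = None  # (sse, feature, threshold, left_mean, right_mean)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            if left.all() or not left.any():
                continue  # split must leave both sides nonempty
            lm, rm = r[left].mean(), r[~left].mean()
            sse = ((r[left] - lm) ** 2).sum() + ((r[~left] - rm) ** 2).sum()
            if best is None or sse < best[0]:
                best = (sse, j, t, lm, rm)
    _, j, t, lm, rm = best
    return lambda Z: np.where(Z[:, j] <= t, lm, rm)

def leverage_regression(X, y, rounds=30, lr=0.5):
    """Master algorithm: iteratively call the base learner on residuals
    (gradient descent on squared loss) and sum the damped hypotheses."""
    base = y.mean()
    F = np.full(len(y), base)       # current master prediction on the sample
    stumps = []
    for _ in range(rounds):
        h = fit_stump(X, y - F)     # residuals = negative gradient of squared loss
        stumps.append(h)
        F += lr * h(X)              # damped gradient-descent step
    def predict(Z):
        out = np.full(len(Z), base)
        for h in stumps:
            out += lr * h(Z)
        return out
    return predict
```

For a piecewise-constant target, a few rounds of this loop drive the sample (training) error toward zero, which is the quantity the paper's AdaBoost-style bounds control.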
Nigel Duffy, David P. Helmbold
Added 24 Aug 2010
Type: Conference