ECML 2007, Springer

Additive Groves of Regression Trees

We present a new regression algorithm called Additive Groves and show empirically that it is superior in performance to a number of other established regression methods. A single Grove is an additive model containing a small number of large trees. Trees added to a Grove are trained on the residual error of other trees already in the model. We begin the training process with a single small tree and gradually increase both the number of trees in the Grove and their size. This procedure ensures that the resulting model captures the additive structure of the response. A single Grove may still overfit to the training set, so we further decrease the variance of the final predictions with bagging. We show that in addition to exhibiting superior performance on a suite of regression test problems, Additive Groves are very resistant to overfitting.
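The training procedure described above — trees fitted on the residual error of the other trees in the Grove, with bagging to reduce variance — can be sketched as follows. This is a simplified illustration, not the authors' implementation: the paper's layered schedule, which gradually grows both the number of trees and their size, is collapsed here into a fixed-size Grove refitted over a few cycles, and all function names (`fit_grove`, `bagged_groves`, etc.) are illustrative.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor


def fit_grove(X, y, n_trees=5, max_depth=4, n_cycles=3):
    """Fit one Grove: an additive model of n_trees regression trees.

    Each tree is (re)trained on the residual of all the OTHER trees'
    predictions, cycling through the ensemble a few times so every
    tree adapts to its companions.
    """
    trees = [None] * n_trees
    preds = np.zeros((n_trees, len(y)))  # per-tree predictions on the training set
    for _ in range(n_cycles):
        for i in range(n_trees):
            # Residual of the model with tree i removed.
            residual = y - (preds.sum(axis=0) - preds[i])
            t = DecisionTreeRegressor(max_depth=max_depth, random_state=0)
            t.fit(X, residual)
            trees[i] = t
            preds[i] = t.predict(X)
    return trees


def grove_predict(trees, X):
    """A Grove predicts with the SUM of its trees (an additive model)."""
    return sum(t.predict(X) for t in trees)


def bagged_groves(X, y, n_bags=10, **grove_kw):
    """Train many Groves on bootstrap samples to reduce variance."""
    rng = np.random.default_rng(0)
    groves = []
    for _ in range(n_bags):
        idx = rng.integers(0, len(y), size=len(y))  # bootstrap resample
        groves.append(fit_grove(X[idx], y[idx], **grove_kw))
    return groves


def bagged_predict(groves, X):
    """Average the Grove predictions, as in standard bagging."""
    return np.mean([grove_predict(g, X) for g in groves], axis=0)
```

Because each tree sees the residual of its companions rather than the raw target, the ensemble divides the response among the trees, which is what lets a Grove capture additive structure that a single large tree would have to approximate with splits.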
Daria Sorokina, Rich Caruana, Mirek Riedewald