Boosting algorithms like AdaBoost and Arc-GV are iterative strategies for minimizing a constrained objective function and are equivalent to Barrier algorithms. Based on this new understanding, we show that the convergence of Boosting-type algorithms becomes simpler to prove, and we outline directions for developing further Boosting schemes. In particular, a new Boosting technique for regression – ε-Boost – is proposed.
Gunnar Rätsch, Manfred K. Warmuth, Sebastian