We introduce a boosting framework to solve a classification problem with added manifold and ambient regularization costs. It allows for a natural extension of boosting into both semi-supervised and unsupervised problems. The augmented cost is minimized in a greedy, stagewise functional minimization procedure as in GradientBoost. Our method provides insights into generalization issues in GradientBoost as applied to trees; these phenomena are relevant also to manifold learning. We describe a quite general framework and then discuss a specific case based on L2 TreeBoost. This framework naturally accommodates supervised learning, manifold learning, partially supervised learning, and unsupervised clustering as particular cases. Multiclass learning tasks fit naturally into the framework as well. Unlike other manifold learning approaches, the family of algorithms derived has linear complexity in the number of data points. The performance of our method is at the state of the art on some...
Nicolas Loeff, David A. Forsyth, Deepak Ramachandran
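To make the L2 TreeBoost base case concrete, the following is a minimal sketch of greedy, stagewise functional minimization of squared loss with regression trees — the supervised core only, without the manifold and ambient regularizers the abstract describes. The function name `l2_treeboost`, the learning rate, and the tree depth are illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def l2_treeboost(X, y, n_stages=50, learning_rate=0.1, max_depth=2):
    """Stagewise L2 boosting: at each stage, fit a regression tree to the
    current residuals (the negative gradient of the squared loss) and take
    a small step in that direction in function space."""
    F = np.full(len(y), y.mean())  # initial constant model F_0
    trees = []
    for _ in range(n_stages):
        residuals = y - F  # negative gradient of (1/2)(y - F)^2 w.r.t. F
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residuals)
        F = F + learning_rate * tree.predict(X)
        trees.append(tree)
    return y.mean(), learning_rate, trees

def predict(model, X):
    """Evaluate the additive model F(x) = F_0 + nu * sum_m T_m(x)."""
    F0, learning_rate, trees = model
    return F0 + learning_rate * sum(t.predict(X) for t in trees)
```

Each stage performs one step of functional gradient descent; the linear cost per stage in the number of data points is what the extended framework preserves when the regularization terms are added.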