

Learning from Heterogeneous Sources via Gradient Boosting Consensus

Multiple data sources containing different types of features may be available for a given task. For instance, users’ profiles can be used to build recommendation systems. In addition, a model can also use users’ historical behaviors and social networks to infer users’ interests in related products. We argue that it is desirable to collectively use all available heterogeneous data sources in order to build effective learning models. We call this framework heterogeneous learning. In our proposed setting, data sources can include (i) non-overlapping features, (ii) non-overlapping instances, and (iii) multiple networks (i.e., graphs) that connect instances. In this paper, we propose a general optimization framework for heterogeneous learning, and devise a corresponding learning model from gradient boosting. The idea is to minimize the empirical loss with two constraints: (1) There should be consensus among the predictions of overlapping instances (if any) from different d...
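
The consensus idea in the abstract can be illustrated with a small functional-gradient sketch. The code below is not the paper's exact GBC algorithm; it assumes squared loss, fully overlapping instances across sources, and a simple pairwise-disagreement consensus penalty. All names (fit_consensus_boosting, lambda_consensus) are illustrative.

# Minimal sketch of gradient boosting with a consensus penalty (assumption:
# squared loss, all sources observe the same instances). Each source trains
# its own boosted ensemble, and the pseudo-residuals include a term that
# pulls the sources' predictions toward agreement on shared instances.
import numpy as np
from sklearn.tree import DecisionTreeRegressor


def fit_consensus_boosting(views, y, n_rounds=100, learning_rate=0.1,
                           lambda_consensus=0.5, max_depth=3):
    """views: list of feature matrices, one per data source, aligned with y."""
    n, n_views = len(y), len(views)
    preds = [np.zeros(n) for _ in range(n_views)]   # current F_v(x_i) per source
    ensembles = [[] for _ in range(n_views)]

    for _ in range(n_rounds):
        avg_pred = np.mean(preds, axis=0)
        for v, X in enumerate(views):
            # Negative functional gradient of
            #   (y - F_v)^2 + lambda * sum_w (F_v - F_w)^2
            # up to a constant factor: fit-to-data term plus consensus pull.
            residual = (y - preds[v]) \
                - lambda_consensus * n_views * (preds[v] - avg_pred)
            tree = DecisionTreeRegressor(max_depth=max_depth)
            tree.fit(X, residual)
            preds[v] += learning_rate * tree.predict(X)
            ensembles[v].append(tree)
    return ensembles


def predict_view(ensemble, X, learning_rate=0.1):
    """Prediction of one source's ensemble on new data from that source."""
    return learning_rate * sum(tree.predict(X) for tree in ensemble)

Setting lambda_consensus to zero recovers independent gradient boosting per source; larger values trade per-source fit for agreement across sources.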
Added 29 Sep 2012
Updated 29 Sep 2012
Type Conference
Year 2012
Where SDM
Authors Xiaoxiao Shi, Jean-François Paiement, David Grangier, Philip S. Yu