Sciweavers search results for "A Boosting Algorithm for Regression" (1707 results, page 10 of 342)
ICONIP 2004
The Most Robust Loss Function for Boosting
A boosting algorithm can be understood as gradient descent on a loss function. It is often pointed out that the typical boosting algorithm, AdaBoost, is seriously affected ...
Takafumi Kanamori, Takashi Takenouchi, Shinto Eguc...
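The abstract's premise can be illustrated with a small sketch (not the paper's proposed loss): AdaBoost minimizes the exponential loss exp(-y F(x)), which blows up on badly misclassified points such as mislabeled outliers, whereas a bounded-growth loss like the logistic loss is far less dominated by them. The margin values below are made up for illustration.

```python
import numpy as np

# Margins y * F(x) for four points; the last is a mislabeled outlier
# with a large negative margin (hypothetical values).
margins = np.array([2.0, 1.5, 1.0, -4.0])

exp_loss = np.exp(-margins)            # AdaBoost's exponential loss
log_loss = np.log1p(np.exp(-margins))  # logistic loss, more robust

# Fraction of the total loss contributed by the single outlier.
exp_share = exp_loss[-1] / exp_loss.sum()
log_share = log_loss[-1] / log_loss.sum()

# The outlier dominates the exponential loss (share ~0.99) much more
# than the logistic loss (share ~0.86).
print(exp_share > 0.9, log_share < 0.9)
```

Because the exponential loss grows without bound as the margin decreases, a single gross outlier can steer the entire gradient-descent direction, which is the sensitivity the abstract refers to.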
CSDA 2007
Boosting ridge regression
Ridge regression is a well established method to shrink regression parameters towards zero, thereby securing existence of estimates. The present paper investigates several approac...
Gerhard Tutz, Harald Binder
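As background for this entry, a minimal sketch of plain ridge regression (not the paper's boosted variant): the L2 penalty makes the normal equations solvable even when the design matrix is rank-deficient, and larger penalties shrink the coefficients harder. All data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
X = np.hstack([X, X[:, :1]])  # duplicated column -> X'X is singular
y = X @ np.array([1.0, -2.0, 0.5, 1.0]) + rng.normal(scale=0.1, size=50)

p = X.shape[1]
# Ridge estimate: (X'X + lam*I)^{-1} X'y exists for any lam > 0,
# even though ordinary least squares fails here (singular X'X).
beta_ridge = np.linalg.solve(X.T @ X + 1.0 * np.eye(p), X.T @ y)
beta_big = np.linalg.solve(X.T @ X + 100.0 * np.eye(p), X.T @ y)

# A larger penalty shrinks the coefficient vector toward zero.
print(np.linalg.norm(beta_big) < np.linalg.norm(beta_ridge))
```

This "securing existence of estimates" is exactly the property the abstract mentions: adding lam*I to X'X guarantees invertibility regardless of collinearity.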
KDD 2009 (ACM)
Grouped graphical Granger modeling methods for temporal causal modeling
We develop and evaluate an approach to causal modeling based on time series data, collectively referred to as "grouped graphical Granger modeling methods." Graphical Granger mo...
Aurelie C. Lozano, Naoki Abe, Yan Liu, Saharon Ros...
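A sketch of the basic (ungrouped) Granger idea these methods build on, using synthetic data: a series x "Granger-causes" y if x's past improves prediction of y beyond what y's own past provides, which we check by comparing residual sums of squares of a restricted and a full lagged regression.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 500
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    # y depends strongly on x's previous value by construction.
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

target = y[1:]
A_restricted = np.column_stack([np.ones(T - 1), y[:-1]])          # y_t ~ y_{t-1}
A_full = np.column_stack([np.ones(T - 1), y[:-1], x[:-1]])        # + x_{t-1}

def rss(A):
    """Residual sum of squares of the least-squares fit of target on A."""
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.sum((target - A @ coef) ** 2)

# Adding x's past sharply reduces prediction error, evidence that
# x Granger-causes y in this toy system.
print(rss(A_full) < 0.5 * rss(A_restricted))
```

The paper's grouped variants extend this pairwise test to regressions over whole groups of lagged variables with graphical (sparse) structure; the sketch above shows only the core predictive-improvement criterion.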
IDEAL 2004 (Springer)
Orthogonal Least Square with Boosting for Regression
A novel technique is presented to construct sparse regression models based on the orthogonal least squares method with boosting. This technique tunes the mean vector and diagonal c...
Sheng Chen, Xunxian Wang, David J. Brown
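A generic greedy-selection sketch in the spirit of orthogonal least squares (not the paper's boosted variant): at each step, pick the column most correlated with the current residual, refit on the selected columns, and repeat, yielding a sparse model. The data and true support below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 10))
beta_true = np.zeros(10)
beta_true[[2, 7]] = [3.0, -2.0]          # only columns 2 and 7 are active
y = X @ beta_true + 0.05 * rng.normal(size=100)

selected, residual = [], y.copy()
for _ in range(2):
    # Score each column by its normalized correlation with the residual.
    scores = [abs(X[:, j] @ residual) / np.linalg.norm(X[:, j])
              for j in range(X.shape[1])]
    selected.append(int(np.argmax(scores)))
    # Refit jointly on all selected columns, then update the residual.
    coef, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
    residual = y - X[:, selected] @ coef

print(sorted(selected))
```

With a strong signal-to-noise ratio the greedy pass recovers the active columns; the paper's contribution layers boosting-style tuning on top of this kind of sparse construction.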
ETVC 2008
Intrinsic Geometries in Learning
In a seminal paper, Amari (1998) proved that learning can be made more efficient when one uses the intrinsic Riemannian structure of the algorithms' spaces of parameters to po...
Richard Nock, Frank Nielsen
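The intrinsic-geometry idea Amari introduced can be sketched on a toy quadratic objective: the natural gradient preconditions the ordinary gradient by the inverse of the metric (the Fisher information in the statistical setting; here, the quadratic's curvature matrix stands in for it), removing the ill-conditioning that slows plain gradient descent.

```python
import numpy as np

# Toy metric / curvature matrix (plays the role of the Fisher information).
F = np.array([[4.0, 0.0],
              [0.0, 0.25]])

def grad(theta):
    """Gradient of the quadratic objective 0.5 * theta' F theta."""
    return F @ theta

theta = np.array([1.0, 1.0])

# Plain gradient step (learning rate 1) overshoots along the stiff axis.
theta_plain = theta - grad(theta)
# Natural gradient step: precondition by F^{-1}; for this quadratic it
# lands exactly on the optimum at the origin in a single step.
theta_nat = theta - np.linalg.solve(F, grad(theta))

print(np.allclose(theta_nat, 0.0),
      np.linalg.norm(theta_plain) > np.linalg.norm(theta_nat))
```

This one-step convergence on a quadratic is the cleanest illustration of why following the parameter space's intrinsic geometry, rather than the naive Euclidean one, makes learning more efficient.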