Sciweavers

1860 search results for "Boosting Methods for Regression" - page 56 / 372
ICDM 2009 (IEEE)
Sparse Least-Squares Methods in the Parallel Machine Learning (PML) Framework
We describe parallel methods for solving large-scale, high-dimensional, sparse least-squares problems that arise in machine learning applications such as document classificatio...
Ramesh Natarajan, Vikas Sindhwani, Shirish Tatikon...
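The abstract above is truncated, so the sketch below is only a loose illustration of the problem class it names: a sparse, regularized least-squares fit solved with SciPy's LSQR, run serially rather than inside the paper's parallel PML framework. The matrix sizes, sparsity, and damping value are placeholders, not details from the paper.

    # Hedged sketch: serial sparse least-squares, not the PML implementation.
    import numpy as np
    from scipy.sparse import random as sparse_random
    from scipy.sparse.linalg import lsqr

    rng = np.random.default_rng(0)
    n_docs, n_features = 10_000, 50_000            # e.g. documents x term features (made up)
    X = sparse_random(n_docs, n_features, density=1e-3, format="csr", random_state=0)
    y = rng.standard_normal(n_docs)                # placeholder regression targets

    # damp > 0 adds ridge-style regularization; LSQR needs only matrix-vector
    # products, so the sparse matrix is never densified.
    w = lsqr(X, y, damp=1.0)[0]
    print("residual norm:", np.linalg.norm(X @ w - y))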
SIGIR 2005 (ACM)
Robustness of adaptive filtering methods in a cross-benchmark evaluation
This paper reports a cross-benchmark evaluation of regularized logistic regression (LR) and incremental Rocchio for adaptive filtering. Using four corpora from the Topic Detection...
Yiming Yang, Shinjae Yoo, Jian Zhang, Bryan Kisiel
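Of the two methods compared, incremental Rocchio is the simpler to sketch; the toy update below is only a generic illustration (the alpha/beta/gamma weights, the thresholding step, and the tiny term vectors are not the paper's configuration).

    # Hedged sketch of an incremental Rocchio profile update for adaptive filtering.
    import numpy as np

    def rocchio_update(profile, doc_vec, relevant, alpha=1.0, beta=0.75, gamma=0.15):
        """Move the topic profile toward relevant documents, away from non-relevant ones."""
        if relevant:
            return alpha * profile + beta * doc_vec
        return alpha * profile - gamma * doc_vec

    profile = np.zeros(5)                          # toy 5-term vocabulary
    stream = [(np.array([1.0, 0.0, 1.0, 0.0, 0.0]), True),
              (np.array([0.0, 1.0, 0.0, 1.0, 0.0]), False)]
    for vec, relevant in stream:
        profile = rocchio_update(profile, vec, relevant)
        # an adaptive filter would now deliver a new document only if its
        # similarity to the updated profile exceeds a tuned threshold
    print(profile)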
GECCO 2005 (Springer)
Genetic programming as a method to develop powerful predictive models for clinical diagnosis
In the field of medicine it is of vital importance to accurately predict the presence of a disease (diagnostic prediction) or the future occurrence of a certain event (prognostic...
Ivar Siccama, Maarten Keijzer
GFKL 2005 (Springer)
Robust Multivariate Methods: The Projection Pursuit Approach
Projection pursuit was originally introduced to identify structures in multivariate data clouds (Huber, 1985). The idea of projecting data to a low-dimensional subspace can also be ...
Peter Filzmoser, Sven Serneels, Christophe Croux, ...
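Because the abstract is cut off, the following is only a generic sketch of the projection pursuit idea it introduces: scan candidate directions and keep the one that maximizes a projection index, here a robust scale (the median absolute deviation). Actual robust PP estimators use more careful search strategies than this random scan, and the data below are synthetic.

    # Hedged sketch: projection pursuit with a robust projection index.
    import numpy as np

    def mad(x):
        """Median absolute deviation, a robust scale estimate."""
        return np.median(np.abs(x - np.median(x)))

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10))
    X[:5] += 8.0                                   # a few outliers

    best_dir, best_score = None, -np.inf
    for _ in range(2000):                          # random unit directions
        a = rng.standard_normal(10)
        a /= np.linalg.norm(a)
        score = mad(X @ a)                         # robust projection index
        if score > best_score:
            best_dir, best_score = a, score
    print("best robust scale found:", best_score)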
NIPS 2004
Log-concavity Results on Gaussian Process Methods for Supervised and Unsupervised Learning
Log-concavity is an important property in the context of optimization, Laplace approximation, and sampling; Bayesian methods based on Gaussian process priors have become quite pop...
Liam Paninski
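For context on the property in the title (this is the standard definition, not a quotation from the paper): a density f on R^n is log-concave if

    f(\lambda x + (1 - \lambda) y) \;\ge\; f(x)^{\lambda} f(y)^{1 - \lambda}
    \qquad \text{for all } x, y \in \mathbb{R}^n,\ \lambda \in [0, 1],

equivalently, \log f is concave. The Gaussian density is log-concave because \log f(x) = -\tfrac{1}{2}(x - \mu)^{\top} \Sigma^{-1} (x - \mu) + \text{const} is a concave quadratic; a unique mode, a convex negative log-density for optimization, and well-behaved Laplace approximations and samplers all follow from this property.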