Sciweavers

70 search results - page 4 / 14
» Using Generalization Error Bounds to Train the Set Covering ...
ML 2002 (ACM)
On the Existence of Linear Weak Learners and Applications to Boosting
We consider the existence of a linear weak learner for boosting algorithms. A weak learner for binary classification problems is required to achieve a weighted empirical error on t...
Shie Mannor, Ron Meir
ICML 2008 (IEEE)
Stopping conditions for exact computation of leave-one-out error in support vector machines
We propose a new stopping condition for a Support Vector Machine (SVM) solver which precisely reflects the objective of the Leave-One-Out error computation. The stopping condition ...
Klaus-Robert Müller, Pavel Laskov, Vojtech Fr...
LWA 2004
Modeling Rule Precision
This paper reports first results of an empirical study of the precision of classification rules on an independent test set. We generated a large number of rules using a general co...
Johannes Fürnkranz
ACL 2006
Minimum Risk Annealing for Training Log-Linear Models
When training the parameters of a natural language system, one would prefer to minimize the 1-best loss (error) on an evaluation set. Since the error surface for many natural languag...
David A. Smith, Jason Eisner
PR 2007
Optimizing resources in model selection for support vector machine
Tuning SVM hyperparameters is an important step in achieving a high-performance learning machine. It is usually done by minimizing an estimate of generalization error based on the...
Mathias M. Adankon, Mohamed Cheriet