Sciweavers

1648 search results - page 219 / 330
» Training Data Selection for Support Vector Machines
JMLR
2010
Increasing Feature Selection Accuracy for L1 Regularized Linear Models
L1 (also referred to as the 1-norm or Lasso) penalty-based formulations have been shown to be effective in problem domains where noisy features are present. However, the L1 penalty...
Abhishek Jaiantilal, Gregory Z. Grudic
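Note: the abstract above is cut off. As a rough, hedged illustration of what an L1-regularized linear model looks like in practice (not the authors' specific method; the synthetic data, alpha value, and scikit-learn usage below are illustrative assumptions), feature selection falls out of the coefficients the penalty drives to exactly zero:

```python
# Minimal sketch of L1 (Lasso) based feature selection; illustrative only.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))          # 200 samples, 50 features (most are pure noise)
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=200)

lasso = Lasso(alpha=0.1)                # alpha controls sparsity; the value is a placeholder
lasso.fit(X, y)

selected = np.flatnonzero(lasso.coef_)  # features whose coefficients survive the L1 penalty
print("selected feature indices:", selected)
```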
ICPR
2004
IEEE
Applying A Hybrid Method To Handwritten Character Recognition
In this paper, we propose a new prototype learning/matching method that can be combined with support vector machines (SVM) in pattern recognition. This hybrid method has the follo...
Chin-Chin Lin, Chun-Jen Chen, Fu Chang
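Note: the abstract above is truncated. One common way to combine prototype matching with an SVM (a hedged sketch, not necessarily the scheme proposed in the paper; the k-means prototypes, shortlist size, and scikit-learn calls are assumptions made for illustration) is to shortlist candidate classes by distance to learned prototypes and then let a multiclass SVM decide among the shortlisted classes:

```python
# Hypothetical prototype-matching + SVM hybrid; a sketch, not the paper's algorithm.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def fit_prototypes(X, y, per_class=3):
    """Learn a few k-means prototypes per class."""
    protos, labels = [], []
    for c in np.unique(y):
        km = KMeans(n_clusters=per_class, n_init=10, random_state=0).fit(X[y == c])
        protos.append(km.cluster_centers_)
        labels.append(np.full(per_class, c))
    return np.vstack(protos), np.concatenate(labels)

def predict_hybrid(x, protos, proto_labels, svm, shortlist=2):
    """Shortlist classes by prototype distance, then let the SVM pick among them."""
    order = proto_labels[np.argsort(np.linalg.norm(protos - x, axis=1))]
    candidates = list(dict.fromkeys(order))[:shortlist]    # nearest distinct classes
    scores = svm.decision_function(x.reshape(1, -1))[0]    # one-vs-rest class scores
    classes = list(svm.classes_)
    return max(candidates, key=lambda c: scores[classes.index(c)])

# Usage sketch: train svm = SVC(decision_function_shape="ovr").fit(X, y) once,
# fit prototypes once, then call predict_hybrid(x, protos, labels, svm) per test sample.
```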
TKDE
2010
Completely Lazy Learning
Local classifiers are sometimes called lazy learners because they do not train a classifier until presented with a test sample. However, such methods are generally not complet...
Eric K. Garcia, Sergey Feldman, Maya R. Gupta, San...
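Note: as context for the truncated abstract above, a lazy local classifier defers all work to query time: it gathers the training points nearest the test sample and fits a small model on only that neighborhood. A minimal sketch (numpy only; the nearest-centroid local model and the value of k are illustrative choices, not the paper's procedure):

```python
# Minimal lazy local classifier: nothing is trained until a test point arrives.
import numpy as np

def lazy_predict(X_train, y_train, x_test, k=25):
    """Fit a tiny local model (nearest class centroid) on the k nearest neighbors."""
    dists = np.linalg.norm(X_train - x_test, axis=1)
    nearest = np.argsort(dists)[:k]                 # local neighborhood of the test sample
    Xk, yk = X_train[nearest], y_train[nearest]
    centroids = {c: Xk[yk == c].mean(axis=0) for c in np.unique(yk)}
    return min(centroids, key=lambda c: np.linalg.norm(centroids[c] - x_test))
```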
UIST
1994
ACM
A Perceptually-Supported Sketch Editor
The human visual system makes a great deal more of images than the elemental marks on a surface. In the course of viewing, creating, or editing a picture, we actively construct a ...
Eric Saund, Thomas P. Moran
SDM
2012
SIAM
Feature Selection "Tomography" - Illustrating that Optimal Feature Filtering is Hopelessly Ungeneralizable
HP Laboratories Technical Report HPL-2010-19R1. Keywords: feature selection; ...
George Forman
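Note: the entry above reduces to a title and report header. As background, "optimal feature filtering" refers to ranking features with a score computed independently of the downstream classifier and keeping the top k. A hedged sketch of such a filter (the correlation score and k are illustrative, not the report's procedure):

```python
# Filter-style feature selection: rank features by a univariate score, keep the top k.
import numpy as np

def filter_top_k(X, y, k=10):
    """Score each feature by |Pearson correlation with the label| and keep the k best."""
    yc = y - y.mean()
    Xc = X - X.mean(axis=0)
    denom = np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc) + 1e-12
    scores = np.abs(Xc.T @ yc) / denom
    return np.argsort(scores)[::-1][:k]             # indices of the k highest-scoring features
```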