Sciweavers

997 search results - page 16 / 200
Search: Completely Lazy Learning
ECCV 2010 (Springer)
Efficient Highly Over-Complete Sparse Coding using a Mixture Model
Sparse coding of sensory data has recently attracted notable attention in research on learning useful features from unlabeled data. Empirical studies show that mapping the data...
IJCNN 2007 (IEEE)
Encoding Complete Body Models Enables Task Dependent Optimal Behavior
Many neural network models of (human) motor learning focus on the acquisition of direct goal-to-action mappings, which results in rather inflexible motor control programs. We ...
Oliver Herbort, Martin V. Butz
ICML 2000 (IEEE)
FeatureBoost: A Meta-Learning Algorithm that Improves Model Robustness
Most machine learning algorithms are lazy: they extract from the training set the minimum information needed to predict its labels. Unfortunately, this often leads to models that ...
Joseph O'Sullivan, John Langford, Rich Caruana, Av...
CORR 2000 (Springer)
A Comparison between Supervised Learning Algorithms for Word Sense Disambiguation
This paper describes a set of comparative experiments, including cross-corpus evaluation, between five alternative algorithms for supervised Word Sense Disambiguation (WSD), namely ...
Gerard Escudero, Lluís Màrquez, Germ...
ML 2008 (ACM)
Unrestricted pure call-by-value recursion
Call-by-value languages commonly restrict recursive definitions by allowing only functions and syntactically explicit values in the right-hand sides. As a consequence, some very a...
Johan Nordlander, Magnus Carlsson, Andy Gill