Sciweavers

Search results for "Online Gradient Descent Learning Algorithms" (146 results)
NIPS
1998
Batch and On-Line Parameter Estimation of Gaussian Mixtures Based on the Joint Entropy
We describe a new iterative method for parameter estimation of Gaussian mixtures. The new method is based on a framework developed by Kivinen and Warmuth for supervised on-line le...
Yoram Singer, Manfred K. Warmuth
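The entry above describes online parameter estimation for Gaussian mixtures. As a generic illustration (not the paper's Kivinen–Warmuth-derived update), the following is a minimal sketch of one stochastic-gradient step on the negative log-likelihood of a 1-D two-component mixture; the parameterization (softmax logits for the mixing weights, log standard deviations) is an assumption made for the sketch:

```python
import numpy as np

def online_gmm_step(theta, x, lr=0.05):
    """One SGD step on -log p(x) for a 1-D, 2-component Gaussian mixture.
    theta = (a, mu, log_s): mixing logits, component means, log std devs.
    (Illustrative only; the paper derives a different on-line update.)"""
    a, mu, log_s = theta
    s = np.exp(log_s)
    pi = np.exp(a - a.max()); pi /= pi.sum()            # softmax mixing weights
    z = (x - mu) / s
    dens = pi * np.exp(-0.5 * z ** 2) / (s * np.sqrt(2 * np.pi))
    resp = dens / dens.sum()                            # posterior responsibilities
    # gradients of -log p(x) w.r.t. logits, means, log std devs
    grad_a = -(resp - pi)
    grad_mu = -resp * (x - mu) / s ** 2
    grad_log_s = -resp * (z ** 2 - 1)
    return (a - lr * grad_a, mu - lr * grad_mu, log_s - lr * grad_log_s)
```

Streaming samples through this step moves the means toward the true component centers; batch EM would instead accumulate responsibilities over the whole dataset before updating.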
KDD
2009
ACM
Information theoretic regularization for semi-supervised boosting
We present novel semi-supervised boosting algorithms that incrementally build linear combinations of weak classifiers through generic functional gradient descent using both labele...
Lei Zheng, Shaojun Wang, Yan Liu, Chi-Hoon Lee
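The abstract above builds ensembles by generic functional gradient descent. A minimal, fully supervised sketch of that idea for squared loss with threshold stumps is below (the paper's algorithm additionally uses unlabeled data via an information-theoretic regularizer, which this sketch omits; the function name and shrinkage parameter are illustrative):

```python
import numpy as np

def l2_boost(x, y, rounds=50, nu=0.5):
    """Functional gradient descent for squared loss with stump weak learners.
    Each round fits a stump to the negative functional gradient (the residual)
    and takes a shrunken step along it."""
    F = np.zeros_like(y, dtype=float)
    ensemble = []
    for _ in range(rounds):
        g = y - F                                  # negative functional gradient
        best = None
        for t in np.unique(x):
            h = np.where(x > t, 1.0, -1.0)         # weak learner: threshold stump
            c = (g @ h) / len(h)                   # optimal coefficient (h has unit entries)
            err = np.sum((g - c * h) ** 2)
            if best is None or err < best[0]:
                best = (err, t, c)
        _, t, c = best
        F += nu * c * np.where(x > t, 1.0, -1.0)   # shrunken gradient step
        ensemble.append((t, nu * c))
    return ensemble, F
```

Semi-supervised variants like the one in the paper change the objective, not this outer loop: the functional gradient is simply taken of the regularized loss instead.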
CORR
2010
Online Identification and Tracking of Subspaces from Highly Incomplete Information
This work presents GROUSE (Grassmannian Rank-One Update Subspace Estimation), an efficient online algorithm for tracking subspaces from highly incomplete observations. GROUSE requi...
Laura Balzano, Robert Nowak, Benjamin Recht
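A minimal NumPy sketch of the style of update GROUSE performs — a rank-one gradient step along a Grassmannian geodesic from a partially observed vector — is below. The function name and the fixed step size are illustrative; the paper's step-size handling is more refined:

```python
import numpy as np

def grouse_step(U, v_omega, omega, eta=0.1):
    """One GROUSE-style update. U (n x d) has orthonormal columns spanning
    the current subspace estimate; v_omega holds observed entries of a new
    vector at indices omega. Returns the updated basis (still orthonormal)."""
    n, d = U.shape
    w = np.linalg.lstsq(U[omega], v_omega, rcond=None)[0]  # best-fit weights
    p = U @ w                                              # prediction in span(U)
    r = np.zeros(n)
    r[omega] = v_omega - U[omega] @ w                      # residual, orthogonal to span(U)
    pn, rn, wn = np.linalg.norm(p), np.linalg.norm(r), np.linalg.norm(w)
    if rn < 1e-12 or wn < 1e-12:
        return U                                           # vector already consistent
    sigma = rn * pn
    # geodesic step: a rank-one rotation mixing the residual direction into U
    return U + ((np.cos(sigma * eta) - 1) * np.outer(p / pn, w / wn)
                + np.sin(sigma * eta) * np.outer(r / rn, w / wn))
```

Because the step follows a geodesic on the Grassmannian, the columns of U stay exactly orthonormal, so no re-orthogonalization pass is needed between updates.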
ICANN
2009
Springer
Evolving Memory Cell Structures for Sequence Learning
The best recent supervised sequence learning methods use gradient descent to train networks of miniature nets called memory cells. The most popular cell structure seems somewhat ar...
Justin Bayer, Daan Wierstra, Julian Togelius, J&uu...
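For reference, the "standard" memory cell structure that this entry's evolutionary search takes as its starting point is the LSTM cell; a minimal NumPy forward pass is sketched below (a single fused weight matrix and no biases — a simplification for illustration):

```python
import numpy as np

def lstm_cell(x, h, c, W):
    """Forward pass of a standard LSTM memory cell.
    x: input (dx,), h: previous hidden state (dh,), c: previous cell state (dh,),
    W: fused gate weights of shape (4*dh, dx+dh)."""
    z = W @ np.concatenate([x, h])
    i, f, o, g = np.split(z, 4)
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)   # input, forget, output gates
    c_new = f * c + i * np.tanh(g)                 # gated cell-state update
    h_new = o * np.tanh(c_new)                     # gated output
    return h_new, c_new
```

The evolved variants in the paper rewire exactly these internals — which gates exist and how they connect — rather than the surrounding training loop.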
JMLR
2012
SpeedBoost: Anytime Prediction with Uniform Near-Optimality
We present SpeedBoost, a natural extension of functional gradient descent, for learning anytime predictors, which automatically trade computation time for predictive accuracy by s...
Alexander Grubb, Drew Bagnell
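The anytime-prediction idea above — a functional-gradient ensemble whose partial sums are themselves usable predictors — can be sketched at inference time as follows (function name, the (cost, weight, learner) tuple layout, and the budget convention are assumptions for illustration, not the paper's API):

```python
def anytime_predict(ensemble, x, budget):
    """Evaluate weak learners in order until the compute budget runs out.
    ensemble: list of (cost, weight, h) tuples; the running weighted sum is
    always a valid prediction, so evaluation can stop at any point."""
    score, spent = 0.0, 0.0
    for cost, weight, h in ensemble:
        if spent + cost > budget:
            break                      # out of budget: return the partial sum
        score += weight * h(x)
        spent += cost
    return score
```

The training-time contribution of SpeedBoost is choosing and ordering the weak learners so that these truncated sums trade computation for accuracy near-optimally; the loop itself stays this simple.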