Sciweavers

218 search results - page 8 / 44
» Simplifying mixture models through function approximation
UAI 2003
Bayesian Hierarchical Mixtures of Experts
The Hierarchical Mixture of Experts (HME) is a well-known tree-structured model for regression and classification, based on soft probabilistic splits of the input space. In its o...
Christopher M. Bishop, Markus Svensén
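
As a rough illustration of the "soft probabilistic splits" mentioned in the snippet, the sketch below blends two hypothetical experts through a single logistic gate. The real HME is a full tree of such gates and the paper treats it in a Bayesian way, so this is only a one-level analogy; the gate and expert parameters (gate_w, gate_b, expert_a, expert_b) are made up.

    import numpy as np

    def hme_predict(x, gate_w, gate_b, expert_a, expert_b):
        # Soft probabilistic split: the logistic gate g(x) blends the two
        # experts' predictions instead of making a hard if/else decision.
        g = 1.0 / (1.0 + np.exp(-(gate_w * x + gate_b)))
        return g * expert_a(x) + (1.0 - g) * expert_b(x)

    # Hypothetical gate and linear experts (all parameters made up).
    x = np.array([-2.0, 0.0, 2.0])
    print(hme_predict(x, gate_w=4.0, gate_b=0.0,
                      expert_a=lambda s: 1.0 + 0.5 * s,
                      expert_b=lambda s: -1.0 + 2.0 * s))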
PKDD 2009, Springer
Feature Selection for Value Function Approximation Using Bayesian Model Selection
Abstract. Feature selection in reinforcement learning (RL), i.e., choosing basis functions such that useful approximations of the unknown value function can be obtained, is one of th...
Tobias Jung, Peter Stone
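
For context, linear value-function approximation writes V(s) ≈ Σ_i w_i φ_i(s) over a set of basis functions. The sketch below fits the weights by plain least squares on hypothetical Monte-Carlo returns, with the basis chosen by hand; it is only a baseline for the feature-selection problem, not the Bayesian model selection the paper proposes.

    import numpy as np

    def fit_value_function(states, returns, basis_fns):
        # V(s) ~= sum_i w_i * phi_i(s): fit the weights by least squares
        # on observed returns. Which functions go into basis_fns is the
        # feature-selection problem (chosen by hand here).
        Phi = np.column_stack([phi(states) for phi in basis_fns])
        w, *_ = np.linalg.lstsq(Phi, returns, rcond=None)
        return lambda s: np.column_stack([phi(s) for phi in basis_fns]) @ w

    # Hypothetical 1-D example with a small polynomial basis.
    states = np.linspace(0.0, 1.0, 50)
    returns = np.cos(3.0 * states) + 0.1 * np.random.randn(50)
    V = fit_value_function(states, returns,
                           [np.ones_like, lambda s: s, lambda s: s ** 2])
    print(V(np.array([0.25, 0.75])))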
AAAI 2010
Gaussian Mixture Model with Local Consistency
The Gaussian Mixture Model (GMM) is one of the most popular data clustering methods and can be viewed as a linear combination of different Gaussian components. In GMM, each cluster ...
Jialu Liu, Deng Cai, Xiaofei He
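
A minimal sketch of the plain GMM density described in the snippet, i.e. a weighted combination of Gaussian components, here in one dimension with fixed parameters; it does not include the local-consistency regularization the paper adds.

    import numpy as np

    def gmm_density(x, weights, means, variances):
        # Mixture density: a convex combination of Gaussian component
        # densities (weights should sum to 1).
        x = np.asarray(x, dtype=float)
        dens = np.zeros_like(x)
        for w, mu, var in zip(weights, means, variances):
            dens += w * np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)
        return dens

    # Two-component example with made-up parameters.
    print(gmm_density([0.0, 2.0],
                      weights=[0.3, 0.7],
                      means=[0.0, 2.0],
                      variances=[1.0, 0.5]))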
AAAI 2008
Adaptive Importance Sampling with Automatic Model Selection in Value Function Approximation
Off-policy reinforcement learning is aimed at efficiently reusing data samples gathered in the past, which is an essential problem for physically grounded AI as experiments are us...
Hirotaka Hachiya, Takayuki Akiyama, Masashi Sugiya...
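
The data-reuse idea behind off-policy methods is often explained with ordinary importance sampling: rescale a trajectory's return by the product of per-step probability ratios between the target and behavior policies. The sketch below shows that basic reweighting with made-up numbers; the adaptive, model-selecting scheme in the paper is more involved than this.

    import numpy as np

    def importance_weighted_return(rewards, target_probs, behavior_probs):
        # Ordinary importance sampling: weight the return by the product of
        # per-step ratios pi(a|s) / b(a|s), so samples collected under an
        # old (behavior) policy can be reused for the current (target) policy.
        ratio = np.prod(np.asarray(target_probs) / np.asarray(behavior_probs))
        return ratio * np.sum(rewards)

    # Hypothetical 3-step trajectory.
    print(importance_weighted_return(rewards=[1.0, 0.0, 2.0],
                                     target_probs=[0.9, 0.5, 0.8],
                                     behavior_probs=[0.5, 0.5, 0.5]))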
ISM 2008, IEEE
A Hardware-Independent Fast Logarithm Approximation with Adjustable Accuracy
Many multimedia applications rely on the computation of logarithms, for example, when estimating log-likelihoods for Gaussian Mixture Models. Knowing of the demand to compute loga...
Oriol Vinyals, Gerald Friedland
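
One common hardware-independent way to approximate a logarithm (not necessarily the authors' exact scheme) is to split x into mantissa and exponent, so log2(x) = e + log2(m), and approximate log2 of the mantissa with a polynomial whose degree sets the accuracy/speed trade-off. A small sketch of that idea, for positive x:

    import math
    import numpy as np

    def make_fast_log2(degree=3):
        # Fit a polynomial to log2 over the mantissa range [0.5, 1) once;
        # higher degree gives better accuracy at higher cost.
        ms = np.linspace(0.5, 1.0, 256)
        coeffs = np.polyfit(ms, np.log2(ms), degree)

        def fast_log2(x):
            m, e = math.frexp(x)          # x = m * 2**e, 0.5 <= m < 1
            return e + np.polyval(coeffs, m)

        return fast_log2

    fast_log2 = make_fast_log2(degree=3)
    print(fast_log2(1000.0), math.log2(1000.0))  # close for degree >= 2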