ESSMAC
2003
Springer

Analysis of Some Methods for Reduced Rank Gaussian Process Regression

Abstract. While there is strong motivation for using Gaussian Processes (GPs) due to their excellent performance in regression and classification problems, their computational complexity makes them impractical when the size of the training set exceeds a few thousand cases. This has motivated the recent proliferation of cost-effective approximations to GPs, both for classification and for regression. In this paper we analyze one popular approximation to GPs for regression: the reduced rank approximation. While GPs are generally equivalent to infinite linear models, we show that Reduced Rank Gaussian Processes (RRGPs) are equivalent to finite sparse linear models. We also introduce the concept of degenerate GPs and show that they correspond to inappropriate priors. We show how to modify the RRGP to prevent it from being degenerate at test time. Training RRGPs consists of learning both the covariance function hyperparameters and the support set. We propose a method for le...
Type Conference
Year 2003
Where ESSMAC
Authors Joaquin Quiñonero Candela, Carl Edward Rasmussen