Sciweavers

Search: "Learning noise" — 596 results, page 40 of 120
NIPS 2004
Log-concavity Results on Gaussian Process Methods for Supervised and Unsupervised Learning
Log-concavity is an important property in the context of optimization, Laplace approximation, and sampling; Bayesian methods based on Gaussian process priors have become quite pop...
Liam Paninski
DSMML 2004, Springer
Extensions of the Informative Vector Machine
The informative vector machine (IVM) is a practical method for Gaussian process regression and classification. The IVM produces a sparse approximation to a Gaussian process by com...
Neil D. Lawrence, John C. Platt, Michael I. Jordan
NIPS 2000
An Information Maximization Approach to Overcomplete and Recurrent Representations
The principle of maximizing mutual information is applied to learning overcomplete and recurrent representations. The underlying model consists of a network of input units driving...
Oren Shriki, Haim Sompolinsky, Daniel D. Lee
CVPR 2009, IEEE
Let the Kernel Figure it Out; Principled Learning of Pre-processing for Kernel Classifiers
Most modern computer vision systems for high-level tasks, such as image classification, object recognition, and segmentation, are based on learning algorithms that are able to se...
Peter V. Gehler, Sebastian Nowozin
ICAC 2006, IEEE
Fast and Effective Worm Fingerprinting via Machine Learning
As Internet worms become ever faster and more sophisticated, it is important to be able to extract worm signatures in an accurate and timely manner. In this paper, we apply mac...
Stewart M. Yang, Jianping Song, Harish Rajamani, T...