Sciweavers

NIPS 2000
Regularized Winnow Methods
In theory, the Winnow multiplicative update has certain advantages over the Perceptron additive update when there are many irrelevant attributes. Recently, there has been much eff...
Tong Zhang
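The contrast at issue in the Regularized Winnow entry is between update rules: the Perceptron adds a misclassified example to the weight vector, while Winnow rescales the weights multiplicatively. Below is a minimal sketch of the two unregularized updates on data with many irrelevant attributes; the multiplicative variant shown is a normalized exponentiated update, and the data and parameters are illustrative, not from the paper.

```python
import numpy as np

def perceptron_train(X, y, epochs=20, lr=1.0):
    """Additive update: on a mistake, w <- w + lr * y_i * x_i."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            if y_i * (w @ x_i) <= 0:
                w += lr * y_i * x_i
    return w

def winnow_train(X, y, epochs=20, eta=0.2):
    """Multiplicative update: on a mistake, w_j <- w_j * exp(eta * y_i * x_ij),
    then renormalize (a normalized exponentiated-update variant of Winnow)."""
    w = np.full(X.shape[1], 1.0 / X.shape[1])
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            if y_i * (w @ x_i) <= 0:
                w *= np.exp(eta * y_i * x_i)
                w /= w.sum()
    return w

# Synthetic data: 200 +/-1 attributes, only the first 5 are relevant.
rng = np.random.default_rng(0)
X = rng.choice([-1.0, 1.0], size=(500, 200))
y = np.sign(X[:, :5].sum(axis=1))

for name, train in [("perceptron", perceptron_train), ("winnow", winnow_train)]:
    w = train(X, y)
    print(name, "training accuracy:", np.mean(np.sign(X @ w) == y))
```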
NIPS 2000
Generalized Belief Propagation
In an important recent paper, Yedidia, Freeman, and Weiss [11] showed that there is a close connection between the belief propagation algorithm for probabilistic inference and the...
Jonathan S. Yedidia, William T. Freeman, Yair Weiss
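For context, the algorithm being generalized in the entry above is standard loopy sum-product belief propagation. Here is a minimal sketch on a toy pairwise Markov random field with a single loop, compared against exact marginals by enumeration; the region-based message passing of generalized BP is not shown, and all potentials are arbitrary illustrative values.

```python
import itertools
import numpy as np

# Tiny pairwise MRF: 3 binary variables on a cycle (the smallest loopy graph).
rng = np.random.default_rng(1)
n = 3
edges = [(0, 1), (1, 2), (0, 2)]
phi = {i: rng.uniform(0.5, 1.5, size=2) for i in range(n)}            # unary potentials
psi = {e: rng.uniform(0.5, 1.5, size=(2, 2)) for e in edges}          # pairwise potentials

def neighbors(i):
    return [b if a == i else a for (a, b) in edges if i in (a, b)]

def edge_pot(i, j):
    """Pairwise potential indexed so that edge_pot(i, j)[x_i, x_j]."""
    return psi[(i, j)] if (i, j) in psi else psi[(j, i)].T

# Loopy sum-product: msgs[(i, j)] is the message from i to j, a vector over x_j.
msgs = {(i, j): np.ones(2) for (a, b) in edges for (i, j) in [(a, b), (b, a)]}
for _ in range(50):                                   # iterate toward a fixed point
    new = {}
    for (i, j) in msgs:
        incoming = np.prod([msgs[(k, i)] for k in neighbors(i) if k != j], axis=0)
        m = edge_pot(i, j).T @ (phi[i] * incoming)    # sum over x_i
        new[(i, j)] = m / m.sum()
    msgs = new

# Beliefs (approximate marginals) from the converged messages.
beliefs = {}
for i in range(n):
    b = phi[i] * np.prod([msgs[(k, i)] for k in neighbors(i)], axis=0)
    beliefs[i] = b / b.sum()

# Exact marginals by brute-force enumeration, for comparison on this tiny model.
joint = np.zeros((2,) * n)
for x in itertools.product([0, 1], repeat=n):
    p = np.prod([phi[i][x[i]] for i in range(n)])
    p *= np.prod([psi[(a, b)][x[a], x[b]] for (a, b) in edges])
    joint[x] = p
joint /= joint.sum()
for i in range(n):
    exact = joint.sum(axis=tuple(k for k in range(n) if k != i))
    print(f"x{i}: BP {beliefs[i].round(3)}  exact {exact.round(3)}")
```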
NIPS 2000
Learning Winner-take-all Competition Between Groups of Neurons in Lateral Inhibitory Networks
It has long been known that lateral inhibition in neural networks can lead to a winner-take-all competition, so that only a single neuron is active at a steady state. Here we show...
Xiaohui Xie, Richard H. R. Hahnloser, H. Sebastian...
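A minimal simulation of the phenomenon described above: a linear-threshold rate network with self-excitation and uniform lateral inhibition in which only the neuron receiving the largest input remains active at steady state. The parameter regime is illustrative, not the learned network of the paper.

```python
import numpy as np

# Rate model with self-excitation alpha and uniform lateral inhibition beta:
#   tau * dx_i/dt = -x_i + relu(b_i + alpha * x_i - beta * sum_j x_j)
def simulate_wta(b, alpha=1.2, beta=2.0, tau=1.0, dt=0.01, steps=5000):
    x = np.zeros_like(b)
    for _ in range(steps):
        drive = np.maximum(b + alpha * x - beta * x.sum(), 0.0)
        x += dt / tau * (-x + drive)      # forward-Euler integration
    return x

b = np.array([0.9, 1.0, 0.7, 0.4])        # external inputs to the four neurons
x = simulate_wta(b)
print("steady-state rates:", x.round(3))  # only the largest-input neuron stays active
```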
NIPS 2000
Feature Selection for SVMs
We introduce a method of feature selection for Support Vector Machines. The method is based upon finding those features which minimize bounds on the leave-one-out error. This sear...
Jason Weston, Sayan Mukherjee, Olivier Chapelle, M...
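The search described above minimizes analytic bounds on the leave-one-out error. As a rough illustration of the objective only, the sketch below evaluates leave-one-out error directly with a greedy backward-elimination wrapper around a linear SVM; scikit-learn is assumed, the data are synthetic, and this is not the bound-minimization procedure of the paper.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

def loo_error(X, y, features):
    """Leave-one-out error of a linear SVM restricted to the given features."""
    scores = cross_val_score(SVC(kernel="linear", C=1.0), X[:, features], y,
                             cv=LeaveOneOut())
    return 1.0 - scores.mean()

def greedy_backward_elimination(X, y, n_keep):
    """Repeatedly drop the feature whose removal hurts LOO error the least."""
    features = list(range(X.shape[1]))
    while len(features) > n_keep:
        errors = [loo_error(X, y, [f for f in features if f != g]) for g in features]
        features.pop(int(np.argmin(errors)))
    return features

# Toy data: 2 informative features out of 10.
rng = np.random.default_rng(2)
y = np.repeat([-1, 1], 30)
X = rng.normal(size=(60, 10))
X[:, 0] += 1.5 * y          # informative
X[:, 1] -= 1.0 * y          # informative
print("selected features:", greedy_backward_elimination(X, y, n_keep=2))
```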
NIPS 2000
Mixtures of Gaussian Processes
We introduce the mixture of Gaussian processes (MGP) model which is useful for applications in which the optimal bandwidth of a map is input dependent. The MGP is derived from the...
Volker Tresp
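For reference, the component the mixture is built from is ordinary Gaussian process regression with a single fixed bandwidth. Below is a minimal sketch with an RBF kernel on synthetic 1-D data; the input-dependent bandwidth of the MGP itself is not implemented here, and all names and values are illustrative.

```python
import numpy as np

def rbf_kernel(A, B, bandwidth):
    """Squared-exponential kernel k(a, b) = exp(-||a - b||^2 / (2 * bandwidth^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def gp_predict(X_train, y_train, X_test, bandwidth=0.3, noise=0.1):
    """Posterior mean and predictive variance of a zero-mean GP with an RBF kernel."""
    K = rbf_kernel(X_train, X_train, bandwidth) + noise ** 2 * np.eye(len(X_train))
    K_star = rbf_kernel(X_test, X_train, bandwidth)
    mean = K_star @ np.linalg.solve(K, y_train)
    var = 1.0 - np.einsum("ij,ij->i", K_star, np.linalg.solve(K, K_star.T).T)
    return mean, var + noise ** 2

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, size=(40, 1))
y = np.sin(6 * X[:, 0]) + 0.1 * rng.normal(size=40)
X_test = np.linspace(0, 1, 5)[:, None]
mean, var = gp_predict(X, y, X_test)
print(np.c_[X_test[:, 0], mean.round(2), var.round(3)])   # input, mean, variance
```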
NIPS 2000
Active Learning for Parameter Estimation in Bayesian Networks
Bayesian networks are graphical representations of probability distributions. In virtually all of the work on learning these networks, the assumption is that we are presented with...
Simon Tong, Daphne Koller
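A toy illustration of the active-learning setting from the entry above: for a single conditional distribution P(Y | X) with a Beta posterior per parent value, the learner chooses which parent instantiation to query next. The variance-based selection heuristic is an assumption made here for illustration, not the paper's criterion.

```python
import numpy as np

# Toy network X -> Y with binary X. We may *choose* x (an active query) and then
# observe y ~ P(Y | X = x); each conditional P(Y=1 | X=x) gets a Beta posterior.
rng = np.random.default_rng(4)
true_theta = {0: 0.9, 1: 0.55}            # true P(Y=1 | X=x), unknown to the learner
counts = {x: np.ones(2) for x in (0, 1)}  # Beta(1, 1) priors: counts[x] = [n_y0, n_y1]

def beta_variance(c):
    a, b = c[1], c[0]
    return a * b / ((a + b) ** 2 * (a + b + 1))

for _ in range(200):
    # Illustrative heuristic: query the parent value with the most uncertain posterior.
    x = max((0, 1), key=lambda v: beta_variance(counts[v]))
    y = int(rng.random() < true_theta[x])                  # observe the child
    counts[x][y] += 1

for x in (0, 1):
    est = counts[x][1] / counts[x].sum()
    print(f"P(Y=1 | X={x}): estimate {est:.2f}, true {true_theta[x]}, "
          f"queries {int(counts[x].sum() - 2)}")
```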
NIPS 2000
Adaptive Object Representation with Hierarchically-Distributed Memory Sites
Theories of object recognition often assume that only one representation scheme is used within one visual-processing pathway. Versatility of the visual system comes from having mu...
Bosco S. Tjan
NIPS 2000
Structure Learning in Human Causal Induction
We use graphical models to explore the question of how people learn simple causal relationships from data. The two leading psychological theories can both be seen as estimating th...
Joshua B. Tenenbaum, Thomas L. Griffiths
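One standard way to cast causal induction as structure learning is to compare, via marginal likelihoods, a graph in which the effect depends on the candidate cause against one in which it does not. The sketch below scores hypothetical contingency data this way under uniform Beta priors; it is an illustrative reading of the setup, not the paper's exact model.

```python
from math import lgamma

def log_marglik_bernoulli(k, n):
    """log of the marginal likelihood of k successes in n Bernoulli trials
    under a uniform Beta(1, 1) prior on the success probability."""
    return lgamma(k + 1) + lgamma(n - k + 1) - lgamma(n + 2)

def structure_score(n_c, k_c, n_b, k_b):
    """Log Bayes factor: 'effect depends on the candidate cause' vs. 'it does not'.
    (n_c, k_c): trials/effects with the cause present; (n_b, k_b): cause absent."""
    dependent = log_marglik_bernoulli(k_c, n_c) + log_marglik_bernoulli(k_b, n_b)
    independent = log_marglik_bernoulli(k_c + k_b, n_c + n_b)
    return dependent - independent

# Hypothetical contingency data: effect in 14/20 trials with the cause present,
# and in 4/20 trials with the cause absent.
print("log Bayes factor:", round(structure_score(20, 14, 20, 4), 2))
```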
NIPS 2000
Rate-coded Restricted Boltzmann Machines for Face Recognition
We describe a neurally-inspired, unsupervised learning algorithm that builds a non-linear generative model for pairs of face images from the same individual. Individuals are then ...
Yee Whye Teh, Geoffrey E. Hinton
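The building block here is the restricted Boltzmann machine trained with contrastive divergence. Below is a minimal binary-binary RBM with one-step CD on toy binary patterns; the rate-coded units and the face-pair training setup of the paper are not reproduced, and all data are synthetic.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class BinaryRBM:
    """Binary-binary RBM trained with 1-step contrastive divergence (CD-1)."""

    def __init__(self, n_visible, n_hidden, rng):
        self.rng = rng
        self.W = 0.01 * rng.normal(size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)
        self.b_h = np.zeros(n_hidden)

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_update(self, v0, lr=0.05):
        h0 = self.hidden_probs(v0)
        h_sample = (self.rng.random(h0.shape) < h0).astype(float)
        v1 = self.visible_probs(h_sample)                 # one-step reconstruction
        h1 = self.hidden_probs(v1)
        self.W += lr * (v0.T @ h0 - v1.T @ h1) / len(v0)
        self.b_v += lr * (v0 - v1).mean(axis=0)
        self.b_h += lr * (h0 - h1).mean(axis=0)
        return np.mean((v0 - v1) ** 2)                    # reconstruction error

# Toy binary data: noisy copies of two 16-pixel prototype patterns.
rng = np.random.default_rng(5)
prototypes = rng.integers(0, 2, size=(2, 16)).astype(float)
data = prototypes[rng.integers(0, 2, size=200)]
data = np.abs(data - (rng.random(data.shape) < 0.05))     # flip 5% of the bits

rbm = BinaryRBM(n_visible=16, n_hidden=8, rng=rng)
for _ in range(30):
    err = rbm.cd1_update(data)
print("final reconstruction error:", round(err, 3))
```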