In theory, the Winnow multiplicative update has certain advantages over the Perceptron additive update when there are many irrelevant attributes. Recently, there has been much eff...
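The multiplicative update contrasted with the Perceptron here is the classic Winnow rule, which can be sketched as follows. This is a minimal illustrative sketch, not the paper's experimental setup: the promotion factor `alpha = 2`, the threshold `d/2`, and the single-relevant-attribute demo below are all assumptions.

```python
import numpy as np

def winnow_train(X, y, alpha=2.0, epochs=20):
    """Train a Winnow classifier on binary features X (0/1) and labels y (0/1).

    Winnow keeps positive weights and updates them multiplicatively:
    on a false negative, weights of active features are multiplied by alpha
    (promotion); on a false positive, they are divided by alpha (demotion).
    Because irrelevant weights shrink geometrically, the mistake bound grows
    only logarithmically with the number of irrelevant attributes.
    """
    n, d = X.shape
    threshold = d / 2.0
    w = np.ones(d)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if w @ xi >= threshold else 0
            if pred == 0 and yi == 1:      # false negative: promote active features
                w[xi == 1] *= alpha
            elif pred == 1 and yi == 0:    # false positive: demote active features
                w[xi == 1] /= alpha
    return w, threshold
```

On a target concept that depends on a single attribute among many irrelevant ones, the relevant weight is only ever promoted (it is never active on a negative example), which is the mechanism behind Winnow's attribute-efficiency.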
In an important recent paper, Yedidia, Freeman, and Weiss [11] showed that there is a close connection between the belief propagation algorithm for probabilistic inference and the...
Jonathan S. Yedidia, William T. Freeman, Yair Weis...
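The belief propagation algorithm referred to above is the sum-product message-passing scheme; Yedidia, Freeman, and Weiss's result concerns its behavior on general (loopy) graphs, where it is only approximate. As a sketch of the exact case, here is sum-product BP on a chain-structured pairwise MRF, where the forward/backward messages recover the true marginals; the chain structure and potentials are illustrative assumptions, not the paper's example.

```python
import numpy as np

def bp_chain_marginals(unaries, pairwise):
    """Exact sum-product belief propagation on a chain-structured pairwise MRF.

    unaries:  length-n list, unaries[i][s] = psi_i(s)
    pairwise: length-(n-1) list, pairwise[i][s, t] = psi_{i,i+1}(s, t)
    Returns the normalized single-node marginals. On a tree (a chain is the
    simplest tree), the fixed point of message passing is exact.
    """
    n = len(unaries)
    # forward messages: fwd[i] is the message arriving at node i from the left
    fwd = [np.ones_like(u) for u in unaries]
    for i in range(1, n):
        fwd[i] = pairwise[i - 1].T @ (unaries[i - 1] * fwd[i - 1])
    # backward messages: bwd[i] is the message arriving at node i from the right
    bwd = [np.ones_like(u) for u in unaries]
    for i in range(n - 2, -1, -1):
        bwd[i] = pairwise[i] @ (unaries[i + 1] * bwd[i + 1])
    # belief at node i = local evidence times both incoming messages
    beliefs = [unaries[i] * fwd[i] * bwd[i] for i in range(n)]
    return [b / b.sum() for b in beliefs]
```

On a loopy graph the same message updates are iterated to a (hoped-for) fixed point, which is the regime the paper connects to the Bethe free energy.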
It has long been known that lateral inhibition in neural networks can lead to a winner-take-all competition, so that only a single neuron is active at a steady state. Here we show...
Xiaohui Xie, Richard H. R. Hahnloser, H. Sebastian...
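The winner-take-all effect of lateral inhibition described above can be reproduced in a very small simulation. The model below is an illustrative assumption, not the paper's exact network: threshold-linear rate neurons with uniform global inhibition of strength `beta`, integrated by forward Euler. With strong enough inhibition, only the neuron receiving the largest input remains active at steady state.

```python
import numpy as np

def winner_take_all(b, beta=4.0, dt=0.05, steps=4000):
    """Simulate threshold-linear neurons coupled by global lateral inhibition.

        dx_i/dt = -x_i + max(0, b_i - beta * sum_j x_j)

    b is the vector of external inputs. At the winner-take-all steady state
    the sole surviving neuron settles at x = b_max / (1 + beta), and every
    other neuron is silenced below its threshold.
    """
    x = np.zeros_like(b)
    for _ in range(steps):
        drive = np.maximum(0.0, b - beta * x.sum())
        x = x + dt * (-x + drive)
    return x
```

The competition is easy to see analytically: once the total activity settles at `S = b_max / (1 + beta)`, any neuron with `b_i < beta * S` receives net-negative drive and decays to zero, leaving a single active unit.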
We introduce a method of feature selection for Support Vector Machines. The method is based upon finding those features which minimize bounds on the leave-one-out error. This sear...
Jason Weston, Sayan Mukherjee, Olivier Chapelle, M...
We introduce the mixture of Gaussian processes (MGP) model which is useful for applications in which the optimal bandwidth of a map is input dependent. The MGP is derived from the...
Bayesian networks are graphical representations of probability distributions. In virtually all of the work on learning these networks, the assumption is that we are presented with...
Theories of object recognition often assume that only one representation scheme is used within one visual-processing pathway. Versatility of the visual system comes from having mu...
We use graphical models to explore the question of how people learn simple causal relationships from data. The two leading psychological theories can both be seen as estimating th...
We describe a neurally-inspired, unsupervised learning algorithm that builds a non-linear generative model for pairs of face images from the same individual. Individuals are then ...