

Training restricted Boltzmann machines using approximations to the likelihood gradient

A new algorithm for training Restricted Boltzmann Machines is introduced. The algorithm, named Persistent Contrastive Divergence, differs from the standard Contrastive Divergence algorithms in that it aims to draw samples from almost exactly the model distribution. It is compared to standard Contrastive Divergence and Pseudo-Likelihood algorithms on the tasks of modeling and classifying various types of data. The Persistent Contrastive Divergence algorithm outperforms the other algorithms while being equally fast and simple.
Tijmen Tieleman
Type: Conference
Year: 2008
Where: ICML
Authors: Tijmen Tieleman
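
The update described in the abstract can be sketched briefly. Below is a minimal NumPy illustration of one Persistent Contrastive Divergence step for a binary RBM; the function name pcd_update, the array shapes, and the hyperparameters (learning rate, number of Gibbs steps k, number of fantasy chains) are illustrative assumptions, not the paper's reference implementation.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def pcd_update(W, b_vis, b_hid, data_batch, fantasy_v, lr=0.01, k=1, rng=None):
    # W: (n_vis, n_hid) weights; b_vis, b_hid: visible/hidden biases.
    # data_batch: (n, n_vis) training batch; fantasy_v: (m, n_vis) persistent chains.
    if rng is None:
        rng = np.random.default_rng()

    # Positive phase: hidden activation probabilities given the data.
    pos_hid = sigmoid(data_batch @ W + b_hid)

    # Negative phase: advance the persistent chains by k Gibbs steps.
    # Unlike ordinary CD-k, the chains are NOT re-initialised from the data;
    # they carry over between updates, approximating samples from the model distribution.
    v = fantasy_v
    for _ in range(k):
        h = (rng.random((v.shape[0], W.shape[1])) < sigmoid(v @ W + b_hid)).astype(float)
        v = (rng.random(v.shape) < sigmoid(h @ W.T + b_vis)).astype(float)
    neg_hid = sigmoid(v @ W + b_hid)

    # Approximate likelihood gradient: data statistics minus model statistics.
    n, m = data_batch.shape[0], v.shape[0]
    W += lr * (data_batch.T @ pos_hid / n - v.T @ neg_hid / m)
    b_vis += lr * (data_batch.mean(axis=0) - v.mean(axis=0))
    b_hid += lr * (pos_hid.mean(axis=0) - neg_hid.mean(axis=0))

    return W, b_vis, b_hid, v  # v is the new state of the persistent chains

In use, fantasy_v would be initialised once (for example to random binary states or a copy of a data batch) and then threaded through successive calls, so the chains keep mixing toward the model distribution across parameter updates.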