Global Optimisation of Neural Network Models via Sequential Sampling

We propose a novel strategy for training neural networks using sequential Monte Carlo algorithms. This global optimisation strategy allows us to learn the probability distribution of the network weights in a sequential framework. It is well suited to applications involving on-line, nonlinear or non-stationary signal processing. We show how the new algorithms can outperform extended Kalman filter (EKF) training.
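The abstract only sketches the idea, so below is a minimal, hedged illustration of sequential sampling over network weights: a generic sequential importance resampling (particle filter) loop that tracks a distribution over the weights of a toy one-hidden-layer network as observations arrive one at a time. The network size, the random-walk transition noise SIGMA_W, the observation noise SIGMA_Y, and the particle count are all illustrative assumptions, not values from the paper, and this is not the authors' exact algorithm.

```python
# Minimal sketch (not the paper's exact method): sequential importance
# resampling over the weights of a tiny one-hidden-layer network, assuming a
# random-walk transition on the weights and Gaussian observation noise.
import numpy as np

rng = np.random.default_rng(0)

N_PARTICLES = 500           # number of weight samples tracked (assumed)
HIDDEN = 5                  # hidden units in the toy network (assumed)
N_WEIGHTS = 3 * HIDDEN + 1  # w1 (H), b1 (H), w2 (H), b2 (1) for 1-D input/output
SIGMA_W = 0.02              # random-walk std on the weights (assumed)
SIGMA_Y = 0.1               # observation noise std (assumed)

def forward(w, x):
    """Evaluate the toy 1-input, 1-output MLP for a flat weight vector w."""
    w1 = w[:HIDDEN]
    b1 = w[HIDDEN:2 * HIDDEN]
    w2 = w[2 * HIDDEN:3 * HIDDEN]
    b2 = w[3 * HIDDEN]
    h = np.tanh(x * w1 + b1)
    return h @ w2 + b2

# Initialise particles from a broad prior over the weights.
particles = rng.normal(0.0, 1.0, size=(N_PARTICLES, N_WEIGHTS))

def step(particles, x_t, y_t):
    """One sequential-sampling update given a new observation (x_t, y_t)."""
    # Predict: diffuse the weights with a random walk.
    particles = particles + rng.normal(0.0, SIGMA_W, size=particles.shape)
    # Weight: Gaussian likelihood of the observed target under each particle.
    preds = np.array([forward(w, x_t) for w in particles])
    log_w = -0.5 * ((y_t - preds) / SIGMA_Y) ** 2
    w_norm = np.exp(log_w - log_w.max())
    w_norm /= w_norm.sum()
    # Resample: multinomial resampling to concentrate on likely weight vectors.
    idx = rng.choice(N_PARTICLES, size=N_PARTICLES, p=w_norm)
    return particles[idx]

# Track a slowly drifting (non-stationary) target function online.
for t in range(200):
    x_t = rng.uniform(-1, 1)
    y_t = np.sin(3 * x_t + 0.01 * t) + rng.normal(0, SIGMA_Y)
    particles = step(particles, x_t, y_t)

print("posterior-mean prediction at x=0.5:",
      np.mean([forward(w, 0.5) for w in particles]))
```

A design note on this sketch: because the weight posterior is represented by samples rather than a single Gaussian, the filter can in principle follow multimodal or strongly nonlinear weight distributions, which is the setting in which the abstract claims an advantage over EKF training.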
Type: Conference
Year: 1998
Where: NIPS
Authors: João F. G. de Freitas, Mahesan Niranjan, Arnaud Doucet, Andrew H. Gee