A Fast and Scalable Recurrent Neural Network Based on Stochastic Meta Descent

This brief presents an efficient and scalable online learning algorithm for recurrent neural networks (RNNs). The approach is based on the real-time recurrent learning (RTRL) algorithm, whereby the sensitivity set of each neuron is reduced to the weights associated with either its input or its output links. This yields reduced storage and computational complexity. Stochastic meta descent (SMD), an adaptive step-size scheme for stochastic gradient-descent problems, is employed as a means of incorporating curvature information in order to substantially accelerate the learning process. We also introduce a clustered version of the algorithm to further improve its scalability. Despite the dramatic reduction in resource requirements, simulation results show that the approach outperforms regular RTRL by almost an order of magnitude. Moreover, the scheme lends itself to parallel hardware realization by virtue of the localized property that is inherent to the learning f...
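For context, SMD maintains an individual step size for every weight and adapts each one multiplicatively, using the correlation between the current gradient and an auxiliary vector that accumulates curvature (Hessian-vector) information. The sketch below shows the generic SMD update in Python/NumPy; it is not the authors' RNN-specific implementation, and the function name smd_step, the meta-rate mu, the decay lam, and the toy quadratic loss are illustrative assumptions.

    import numpy as np

    def smd_step(w, eta, v, grad, hv, mu=0.05, lam=0.99):
        """One generic stochastic meta descent (SMD) update.

        w    : weight vector
        eta  : per-weight step sizes, adapted online
        v    : auxiliary vector tracking how the weights depend on log(eta)
        grad : stochastic gradient at w
        hv   : Hessian-vector product H @ v (the curvature information SMD uses)
        mu   : meta learning rate (assumed value; tune per problem)
        lam  : decay factor for v (assumed value)
        """
        # Multiplicative step-size update; the max() guard keeps step
        # sizes positive and limits how fast they can shrink.
        eta = eta * np.maximum(0.5, 1.0 - mu * grad * v)
        # Plain stochastic gradient step with one step size per weight.
        w_new = w - eta * grad
        # Update the auxiliary vector using the Hessian-vector product.
        v_new = lam * v - eta * (grad + lam * hv)
        return w_new, eta, v_new

    # Toy usage on a quadratic loss E(w) = 0.5 * w @ A @ w, whose gradient
    # is A @ w and whose Hessian-vector product is simply A @ v.
    A = np.diag([1.0, 10.0])        # ill-conditioned curvature
    w = np.array([1.0, 1.0])
    eta = np.full(2, 0.05)
    v = np.zeros(2)
    for _ in range(200):
        g = A @ w
        w, eta, v = smd_step(w, eta, v, g, A @ v)
    print(w)                        # approaches the minimum at the origin

In the paper's RNN setting the gradient and Hessian-vector product would come from the reduced-sensitivity RTRL recursion rather than a closed-form loss, but the per-weight step-size adaptation is the same.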
Added: 15 Dec 2010
Updated: 15 Dec 2010
Type: Journal
Year: 2008
Where: TNN (IEEE Transactions on Neural Networks)
Authors: Zhenzhen Liu, Itamar Elhanany