Learning grammatical structure with Echo State Networks

Echo State Networks (ESNs) have been shown to be effective for a number of tasks, including motor control, dynamic time series prediction, and memorizing musical sequences. However, their performance on natural language tasks has been largely unexplored until now. Simple Recurrent Networks (SRNs) have a long history in language modeling and show a striking similarity in architecture to ESNs. A comparison of SRNs and ESNs on a natural language task is therefore a natural choice for experimentation. Elman applied SRNs to a standard task in statistical NLP: predicting the next word in a corpus, given the previous words. Using a simple context-free grammar and an SRN with backpropagation through time (BPTT), Elman showed that the network was able to learn internal representations sensitive to linguistic processes useful for the prediction task. Here, using ESNs, we show that training such internal representations is unnecessary to achieve levels of performance comparab...
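
For readers unfamiliar with the ESN setup the abstract contrasts with BPTT-trained SRNs, the sketch below shows the basic recipe in NumPy: a fixed random recurrent reservoir is driven by the input sequence, and only a linear readout is fit to predict the next symbol. This is a minimal illustration; the vocabulary size, reservoir size, spectral-radius scaling, and ridge penalty are assumptions chosen for the example, not parameters taken from the paper.

```python
import numpy as np

# Minimal Echo State Network sketch for next-symbol prediction.
# The recurrent "reservoir" weights are random and fixed (no BPTT);
# only a linear readout is trained. All sizes and constants below are
# illustrative assumptions, not values from the paper.

rng = np.random.default_rng(0)

n_sym = 10      # assumed vocabulary size (one-hot coded symbols)
n_res = 200     # assumed reservoir size

# Fixed random input and reservoir weights -- never trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_sym))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # keep spectral radius < 1

def run_reservoir(inputs):
    """Collect reservoir states while driving the ESN with one-hot inputs."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy corpus: a random symbol sequence; the target is always the next symbol.
seq = rng.integers(0, n_sym, size=1000)
U = np.eye(n_sym)[seq[:-1]]     # inputs
Y = np.eye(n_sym)[seq[1:]]      # next-symbol targets

X = run_reservoir(U)

# Train only the readout, in closed form via ridge regression.
ridge = 1e-6
W_out = Y.T @ X @ np.linalg.inv(X.T @ X + ridge * np.eye(n_res))

scores = X @ W_out.T            # per-step scores over the next symbol
print(scores.shape)             # (999, 10)
```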
Type: Journal
Year: 2007
Where: NN (Neural Networks)
Authors: Matthew H. Tong, Adam D. Bickett, Eric M. Christiansen, Garrison W. Cottrell