Transition-Based Dependency Parsing with Stack Long Short-Term Memory

We propose a technique for learning representations of parser states in transition-based dependency parsers. Our primary innovation is a new control structure for sequence-to-sequence neural networks: the stack LSTM. Like the conventional stack data structures used in transition-based parsing, elements can be pushed to or popped from the top of the stack in constant time, but, in addition, an LSTM maintains a continuous-space embedding of the stack contents. This lets us formulate an efficient parsing model that captures three facets of a parser’s state: (i) unbounded look-ahead into the buffer of incoming words, (ii) the complete history of actions taken by the parser, and (iii) the complete contents of the stack of partially built tree fragments, including their internal structures. Standard backpropagation techniques are used for training and yield state-of-the-art parsing performance.
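The core mechanism the abstract describes (constant-time push and pop, with the LSTM state at the top summarizing the whole stack) can be sketched in a few lines. Below is a minimal illustration in PyTorch; the StackLSTM class name, the dimensions, and the learned empty-stack state are assumptions made here for illustration, not the authors' implementation. The idea is to keep a stack of LSTM (h, c) states: push runs one LSTM step from the current top, and pop simply reverts to the previous state.

import torch
import torch.nn as nn

class StackLSTM(nn.Module):
    # Hypothetical sketch of a stack LSTM: a stack whose contents are
    # also summarized by an LSTM hidden state.
    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.cell = nn.LSTMCell(input_dim, hidden_dim)
        # Learned state representing the empty stack (an assumption
        # of this sketch; a zero state would also work).
        self.h0 = nn.Parameter(torch.zeros(1, hidden_dim))
        self.c0 = nn.Parameter(torch.zeros(1, hidden_dim))
        self.states = [(self.h0, self.c0)]  # stack of (h, c) pairs

    def push(self, x):
        # One LSTM step from the current top state: O(1).
        h, c = self.cell(x, self.states[-1])
        self.states.append((h, c))

    def pop(self):
        # Dropping the top state restores the previous summary: O(1).
        assert len(self.states) > 1, "pop from empty stack"
        self.states.pop()

    def embedding(self):
        # The hidden state at the top embeds the full stack contents.
        return self.states[-1][0]

# Usage: push two (hypothetical) word vectors, pop one; embedding()
# then summarizes the remaining one-element stack.
slstm = StackLSTM(input_dim=50, hidden_dim=50)
for vec in torch.randn(2, 1, 50):
    slstm.push(vec)
slstm.pop()
print(slstm.embedding().shape)  # torch.Size([1, 50])

In the parsing model, separate structures of this kind track the three facets named in the abstract: the buffer of incoming words, the action history, and the stack of partial trees. The sketch above covers only the generic push/pop mechanics.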
Added 13 Apr 2016
Updated 13 Apr 2016
Type Conference
Year 2015
Where ACL
Authors Chris Dyer, Miguel Ballesteros, Wang Ling, Austin Matthews, Noah A. Smith