Generating Trees or Graphs Incrementally (2/20/2020)
Content:
- What is Transition-based Parsing?
- Shift-reduce Parsing with Feed-forward Nets (a minimal arc-standard sketch follows the reference list below)
- Stack LSTM
- Transition-based Models for Phrase Structure
- A Simple Alternative: Linearized Trees (see the linearization sketch at the end of this section)
- Recommended Reading (no quiz): Dependency Parsing, Jurafsky and Martin, Chapter 15 (through Section 15.4)
- Reference: Shift-reduce Parsing (Yamada and Matsumoto 2003)
- Reference: Shift-reduce Parsing (Nivre 2003)
- Reference: Feature Engineering for Parsing (Zhang and Nivre 2011)
- Reference: Feed-forward Dependency Parsing (Chen and Manning 2014)
- Reference: Recursive RNNs (Socher et al. 2011)
- Reference: Tree-structured LSTM (Tai et al. 2015)
- Reference: Stack LSTM Dependency Parsing (Dyer et al. 2015)
- Reference: Easy-first Neural Parsing (Kiperwasser and Goldberg 2016)
- Reference: Top-down Neural Parsing (Ma et al. 2018)
- Reference: Shift-reduce Phrase Structure Parsing (Watanabe et al. 2015)
- Reference: Recurrent Neural Network Grammars (Dyer et al. 2016)
- Reference: Linearized Trees (Vinyals et al. 2015)
- Reference: When is Structure Useful (Kuncoro et al. 2018)
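To make the shift-reduce idea from the content list concrete, the following is a minimal sketch of arc-standard transition-based dependency parsing that simply follows the oracle for a gold, projective tree. In a trained parser such as Chen and Manning (2014), a feed-forward network would instead score SHIFT / LEFT-ARC / RIGHT-ARC from features of the stack and buffer. The function name `arc_standard_oracle` and the (dependent, head) arc convention are illustrative choices for this sketch, not taken from the course's sample code.

```python
# A minimal sketch of arc-standard transition-based dependency parsing,
# assuming gold heads are available so the oracle transition can be taken.
# Assumes a projective gold tree; names are illustrative only.

def arc_standard_oracle(words, heads):
    """Parse a sentence with SHIFT / LEFT-ARC / RIGHT-ARC transitions.

    words: list of tokens (index 0 is an artificial ROOT)
    heads: heads[i] is the index of word i's head (ROOT's head is -1)
    Returns the transition sequence and the recovered (dependent, head) arcs.
    """
    stack = [0]                          # start with ROOT on the stack
    buffer = list(range(1, len(words)))  # remaining word indices
    arcs, transitions = [], []

    def has_pending_children(idx):
        # A word may only be reduced once all of its children are attached.
        return any(heads[j] == idx and (j, idx) not in arcs
                   for j in range(len(words)))

    while buffer or len(stack) > 1:
        if len(stack) >= 2:
            s1, s0 = stack[-2], stack[-1]
            # LEFT-ARC: second-to-top s1 is a dependent of top s0
            if s1 != 0 and heads[s1] == s0:
                arcs.append((s1, s0))
                stack.pop(-2)
                transitions.append("LEFT-ARC")
                continue
            # RIGHT-ARC: top s0 is a dependent of s1 and has no pending children
            if heads[s0] == s1 and not has_pending_children(s0):
                arcs.append((s0, s1))
                stack.pop()
                transitions.append("RIGHT-ARC")
                continue
        # Otherwise SHIFT the next buffer word onto the stack
        stack.append(buffer.pop(0))
        transitions.append("SHIFT")

    return transitions, arcs


if __name__ == "__main__":
    # "ROOT I saw her" with heads: I -> saw, saw -> ROOT, her -> saw
    words = ["ROOT", "I", "saw", "her"]
    heads = [-1, 2, 0, 2]
    trans, arcs = arc_standard_oracle(words, heads)
    print(trans)  # ['SHIFT', 'SHIFT', 'LEFT-ARC', 'SHIFT', 'RIGHT-ARC', 'RIGHT-ARC']
    print(arcs)   # [(1, 2), (3, 2), (2, 0)]
```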
Slides: Incremental Parsing Slides
Sample Code: Incremental Parsing Code Examples
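To illustrate the "Linearized Trees" topic, here is a small sketch of how a phrase-structure tree can be flattened into a bracketed token sequence in the spirit of Vinyals et al. (2015), so that an ordinary sequence-to-sequence model can predict the parse as a string. The nested-tuple tree representation and the `linearize` helper are assumptions made for this example; Vinyals et al. additionally normalize the terminals, which is omitted here for readability.

```python
# A minimal sketch of tree linearization: a phrase-structure tree is flattened
# into tokens like "(S", "(NP", "the", ")NP", ")S" that a seq2seq model can emit.
# The (label, children) tuple format is an assumption made for this sketch.

def linearize(tree):
    """Flatten a tree of (label, children) tuples; leaves are plain strings."""
    if isinstance(tree, str):          # a terminal word
        return [tree]
    label, children = tree
    tokens = ["(" + label]
    for child in children:
        tokens.extend(linearize(child))
    tokens.append(")" + label)
    return tokens


if __name__ == "__main__":
    # (S (NP the dog) (VP barked))
    tree = ("S", [("NP", ["the", "dog"]), ("VP", ["barked"])])
    print(" ".join(linearize(tree)))
    # (S (NP the dog )NP (VP barked )VP )S
```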