Modeling 1 - Recurrent Networks (9/8/2022)
Content:
- Recurrent Networks
- Vanishing Gradient and LSTMs
- Strengths and Weaknesses of Recurrence in Sentence Modeling
- Pre-training for RNNs
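As a quick preview of the first topic, a recurrent network reads a sequence one token at a time, updating a hidden state h_t = tanh(W_xh x_t + W_hh h_{t-1} + b) at each step (the Elman 1990 formulation covered in lecture). The sketch below is a minimal toy illustration of that recurrence, not the course's sample code; the tiny dimensions and weight values are made up for demonstration.

```python
import math

def rnn_step(x, h_prev, W_xh, W_hh, b):
    """One Elman RNN step: h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1} + b).

    All arguments are plain Python lists (vectors) and lists of lists
    (matrices); toy values only, chosen for illustration.
    """
    h = []
    for i in range(len(b)):
        s = b[i]
        s += sum(W_xh[i][j] * x[j] for j in range(len(x)))
        s += sum(W_hh[i][j] * h_prev[j] for j in range(len(h_prev)))
        h.append(math.tanh(s))
    return h

# Toy run: 2-d inputs, 2-d hidden state, arbitrary diagonal weights.
W_xh = [[0.5, 0.0], [0.0, 0.5]]
W_hh = [[0.1, 0.0], [0.0, 0.1]]
b = [0.0, 0.0]

h = [0.0, 0.0]  # initial hidden state
for x in [[1.0, 0.0], [0.0, 1.0]]:
    h = rnn_step(x, h, W_xh, W_hh, b)
```

Because each step multiplies by W_hh inside a squashing nonlinearity, gradients through many steps can shrink toward zero, which is the vanishing-gradient problem that motivates the LSTM and GRU architectures on the reading list.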
Reading Material:
- Recommended Reading: Goldberg Book, Chapter 13 (pp. 151-157) and all of Chapters 14-15
- Other Reading: Goldberg Book Chapter 16 (covered in class)
- Reference: RNNs (Elman 1990)
- Reference: LSTM (Hochreiter and Schmidhuber 1997)
- Reference: Variants of LSTM (Greff et al. 2015)
- Reference: GRU (Cho et al. 2014)
- Reference: Visualizing Recurrent Nets (Karpathy et al. 2015)
- Reference: Learning Syntax from Translation (Shi et al. 2016)
- Reference: Learning Sentiment from LMs (Radford et al. 2017)
- Reference: Optimizing RNNs with CuDNN (Appleyard 2015)
Slides: RNN Slides
Sample Code: RNN Code Examples