Conditioned Generation (1/30/2020)
Content:
- Encoder-Decoder Models (see the first sketch below)
- Conditional Generation and Search (see the beam-search sketch below)
- Ensembling (see the ensembling sketch below)
- Evaluation (see the BLEU sketch below)
- Types of Data to Condition On
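As a concrete illustration of the encoder-decoder topic above, here is a minimal sketch of an LSTM encoder-decoder in the spirit of Sutskever et al. (2014), assuming PyTorch; the vocabulary sizes, dimensions, and toy batch are illustrative choices, not values from the lecture.

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, src_vocab, trg_vocab, emb_dim=64, hid_dim=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.trg_emb = nn.Embedding(trg_vocab, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, trg_vocab)

    def forward(self, src, trg):
        # Encode the source; the final (h, c) state conditions the decoder.
        _, state = self.encoder(self.src_emb(src))
        # Teacher forcing: feed the gold target prefix during training.
        dec_out, _ = self.decoder(self.trg_emb(trg), state)
        return self.out(dec_out)  # per-step logits over the target vocab

model = Seq2Seq(src_vocab=1000, trg_vocab=1000)
src = torch.randint(0, 1000, (2, 7))  # toy batch: 2 sequences of length 7
trg = torch.randint(0, 1000, (2, 5))
logits = model(src, trg)              # shape: (2, 5, 1000)
```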
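For the search side of conditional generation, the sketch below implements beam search in plain Python. The `log_probs` function is a hypothetical stand-in for a real decoder's next-token log-softmax; here it is a fixed toy table so the example runs on its own.

```python
import math

def log_probs(prefix):
    # Toy next-token distribution over a 3-word vocabulary; a real system
    # would run the decoder one step conditioned on `prefix`.
    table = {0: 0.5, 1: 0.3, 2: 0.2}  # token 2 = </s>
    return {w: math.log(p) for w, p in table.items()}

def beam_search(beam_size=2, max_len=5, eos=2):
    beams = [([], 0.0)]  # (token prefix, cumulative log-probability)
    finished = []
    for _ in range(max_len):
        # Expand every hypothesis on the beam by every possible next token.
        candidates = []
        for prefix, score in beams:
            for w, lp in log_probs(prefix).items():
                candidates.append((prefix + [w], score + lp))
        candidates.sort(key=lambda c: c[1], reverse=True)
        # Refill the beam from the top; completed hypotheses move aside.
        beams = []
        for prefix, score in candidates:
            if len(beams) >= beam_size:
                break
            if prefix[-1] == eos:
                finished.append((prefix, score))
            else:
                beams.append((prefix, score))
        if not beams:
            break
    return max(finished + beams, key=lambda c: c[1])

print(beam_search())
```

With beam_size=1 this reduces to greedy decoding.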
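Ensembling at decoding time is typically done by combining the per-step predictive distributions of several independently trained models. A minimal sketch, assuming NumPy; the two fixed arrays stand in for real decoders' outputs at one time step.

```python
import numpy as np

def ensemble_step(prob_dists):
    # Linear interpolation: average the models' next-token probabilities.
    return np.mean(prob_dists, axis=0)

p1 = np.array([0.7, 0.2, 0.1])  # model 1's next-token distribution
p2 = np.array([0.3, 0.5, 0.2])  # model 2's next-token distribution
p = ensemble_step([p1, p2])
print(p, p.argmax())  # [0.5 0.35 0.15] -> token 0
```

Averaging in log space (a geometric mean of probabilities) is a common alternative; it more heavily penalizes tokens that any single model considers unlikely.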
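For evaluation, here is a simplified, self-contained sketch of sentence-level BLEU (Papineni et al. 2002): clipped n-gram precisions combined by a geometric mean, times a brevity penalty. It omits the smoothing and corpus-level aggregation used in practice, so a hypothesis with no 4-gram match scores near zero.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(hyp, ref, max_n=4):
    log_prec = 0.0
    for n in range(1, max_n + 1):
        h, r = ngrams(hyp, n), ngrams(ref, n)
        match = sum(min(c, r[g]) for g, c in h.items())  # clipped counts
        total = max(sum(h.values()), 1)
        log_prec += math.log(max(match, 1e-9) / total) / max_n
    bp = min(1.0, math.exp(1 - len(ref) / len(hyp)))  # brevity penalty
    return bp * math.exp(log_prec)

hyp = "the cat sat on the mat".split()
ref = "the cat sat on a mat".split()
print(round(bleu(hyp, ref), 3))  # 0.537
```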
Reading Material
- Required Reading (for quiz): Neural Machine Translation and Sequence-to-Sequence Models, Chapter 7
- Reference: Recurrent Continuous Translation Models (Kalchbrenner and Blunsom 2013)
- Reference: LSTM Encoder-Decoders (Sutskever et al. 2014)
- Reference: BLEU (Papineni et al. 2002)
- Reference: METEOR (Banerjee and Lavie 2005)
- Reference: Knowledge Distillation (Kim and Rush 2016)
- Reference: Generation from Structured Data (Wen et al. 2015)
- Reference: Challenges in Data-to-Document Generation (Wiseman et al. 2017)
- Reference: Generation from Input+Tags (Zhou and Neubig 2017)
- Reference: Generation from TED Talks (Hoang et al. 2016)
- Reference: Generation from Images (Karpathy and Li 2015)
- Reference: Generation from Recipes (Kiddon et al. 2016)
- Reference: Generation from Word Embeddings (Noraset et al. 2017)
Slides: Conditional LM Slides
Sample Code: Conditional LM Code Examples