Conditioned Generation (2/18/2021)
Content:
- Encoder-Decoder Models (see the code sketch after this list)
- Conditional Generation and Search
- Ensembling
- Evaluation
- Types of Data to Condition On
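As a rough companion to the first three topics, here is a minimal, hypothetical sketch of an encoder-decoder conditional LM with greedy search and probability-averaging ensembling. All class and function names are invented for illustration; this is not the course's sample code.

```python
# Illustrative only: a toy encoder-decoder, greedy search, and ensembling.
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, src_vocab, trg_vocab, dim=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, dim)
        self.trg_emb = nn.Embedding(trg_vocab, dim)
        self.encoder = nn.LSTM(dim, dim, batch_first=True)
        self.decoder = nn.LSTM(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, trg_vocab)

    def forward(self, src, trg):
        # The encoder's final state conditions every decoding step.
        _, state = self.encoder(self.src_emb(src))
        dec_out, _ = self.decoder(self.trg_emb(trg), state)
        return self.out(dec_out)  # logits over the target vocabulary

def greedy_decode(model, src, bos_id, eos_id, max_len=50):
    # Greedy search: take the argmax token at each step (beam search
    # would instead keep the k best partial hypotheses).
    _, state = model.encoder(model.src_emb(src))
    tok, out = torch.tensor([[bos_id]]), []
    for _ in range(max_len):
        dec_out, state = model.decoder(model.trg_emb(tok), state)
        tok = model.out(dec_out)[:, -1].argmax(-1, keepdim=True)
        if tok.item() == eos_id:
            break
        out.append(tok.item())
    return out

def ensemble_step(models, states, tok):
    # Ensembling: average the next-token distributions of several models,
    # then pick (or beam-search over) the combined distribution.
    probs, new_states = [], []
    for m, st in zip(models, states):
        dec_out, st = m.decoder(m.trg_emb(tok), st)
        probs.append(m.out(dec_out)[:, -1].softmax(-1))
        new_states.append(st)
    return torch.stack(probs).mean(0), new_states
```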
Reading Material:
- Suggested Reading: Neural Machine Translation and Sequence-to-Sequence Models, Chapter 7
- Reference: Recurrent Neural Translation Models (Kalchbrenner and Blunsom 2013)
- Reference: LSTM Encoder-Decoders (Sutskever et al. 2014)
- Reference: Generation from Structured Data (Wen et al. 2015)
- Reference: Challenges in Data-to-Document Generation (Wiseman et al. 2017)
- Reference: Generation from Input+Tags (Zhou and Neubig 2017)
- Reference: Generation from TED Talks (Hoang et al. 2016)
- Reference: Generation from Images (Karpathy and Li 2015)
- Reference: Generation from Recipes (Kiddon et al. 2016)
- Reference: Generation from Word Embeddings (Noraset et al. 2017)
- Reference: WMT Translation Tasks
- Reference: GENIE Leaderboard
- Reference: BLEU (Papineni et al. 2002), illustrated in the scoring example after this list
- Reference: BERTScore (Zhang et al. 2020)
- Reference: BLEURT (Sellam et al. 2020)
- Reference: COMET (Rei et al. 2020)
- Reference: PRISM (Thompson and Post 2020)
- Reference: WMT Metrics Shared Task (Mathur et al. 2020)
- Reference: Re-evaluating Evaluation in Text Summarization (Bhandari et al. 2020)
- Reference: Knowledge Distillation (Kim et al. 2016)
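To make the metric references above concrete, here is a small example of scoring a system output against a reference with the sacrebleu package; the sentences are invented placeholders.

```python
# Scoring a hypothesis against references with sacreBLEU (pip install sacrebleu).
import sacrebleu

hypotheses = ["the cat sat on the mat"]           # system outputs (invented)
references = [["the cat is sitting on the mat"]]  # one inner list per reference set

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.1f}")
```

Learned metrics such as BERTScore, BLEURT, COMET, and PRISM, discussed in the readings above, are distributed as separate packages with their own interfaces.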
Slides: Conditional LM Slides
Video: Conditional LM Video
Sample Code: Conditional LM Code Examples