Multi-task and Multi-lingual Learning Models (4/17/2018)
- What is Multi-task Learning?
- Methods for Multi-task Learning (a short hard-parameter-sharing sketch appears after the reference list)
- Multi-task Objectives for NLP
- Required Reading (for quiz): Multi-task Learning in Neural Networks and Multi-task Objectives for NLP (Ruder 2017)
- Reference: Natural Language Processing (Almost) from Scratch (Collobert et al. 2011)
- Reference: Regularization Techniques (Barone et al. 2017)
- Reference: Word Representations (Turian et al. 2010)
- Reference: Semi-supervised Sequence Learning (Dai and Le 2015)
- Reference: Gaze Prediction + Summarization (Klerke et al. 2016)
- Reference: Selective Transfer (Zoph et al. 2016)
- Reference: Soft Parameter Tying (Duong et al. 2015)
- Reference: Translation-based Encoder Pretraining (McCann et al. 2017)
- Reference: Bidirectional Language Model Pretraining (Peters et al. 2017)
- Reference: Pre-training for MT (Luong et al. 2015)
- Reference: Domain Adaptation via Feature Augmentation (Kim et al. 2016)
- Reference: Feature Augmentation w/ Tags (Chu et al. 2017)
- Reference: Unsupervised Adaptation (Long et al. 2015)
- Reference: Multilingual MT (Johnson et al. 2017)
- Reference: Multilingual MT (Ha et al. 2016)
- Reference: Teacher-student Multilingual NMT (Chen et al. 2017)
- Reference: Multiple Annotation Standards for Semantic Parsing (Peng et al. 2017)
- Reference: Multiple Annotation Standards for Word Segmentation (Chen et al. 2017)
- Reference: Modeling Annotator Variance (Guan et al. 2017)
- Reference: Different Layers for Different Tasks (Hashimoto et al. 2017)
- Reference: Polyglot Language Models (Tsvetkov et al. 2016)
- Reference: Many Languages, One Parser (Ammar et al. 2016)
- Reference: Multilingual Relation Extraction (Lin et al. 2017)
Slides (from Fall 2017): Multitask Slides
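
The "Methods for Multi-task Learning" topic above covers, among others, hard parameter sharing: a single shared encoder feeds task-specific output heads, in the spirit of Collobert et al. (2011) and the Ruder (2017) overview. Below is a minimal, illustrative PyTorch sketch of that idea; the two tasks (tagging and sentence classification), the dimensions, and the toy training loop are assumptions chosen for illustration, not code from the course materials.

```python
# Minimal sketch of hard parameter sharing for multi-task learning:
# one shared encoder, one output head per task, alternating task batches.
# All tasks, sizes, and data here are illustrative assumptions.
import torch
import torch.nn as nn

class SharedEncoderMultiTask(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hidden_dim=256,
                 n_tags=17, n_labels=2):
        super().__init__()
        # Parameters shared across all tasks
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # Task-specific heads (hypothetical tasks: tagging, sentence classification)
        self.tagger_head = nn.Linear(2 * hidden_dim, n_tags)
        self.classifier_head = nn.Linear(2 * hidden_dim, n_labels)

    def forward(self, tokens, task):
        states, _ = self.encoder(self.embed(tokens))        # (batch, len, 2*hidden)
        if task == "tagging":
            return self.tagger_head(states)                  # per-token scores
        if task == "classification":
            return self.classifier_head(states.mean(dim=1))  # pooled sentence scores
        raise ValueError(f"unknown task: {task}")

model = SharedEncoderMultiTask()
loss_fn = nn.CrossEntropyLoss()
opt = torch.optim.Adam(model.parameters())

# Alternate mini-batches between tasks; gradients from both update the shared encoder.
tokens = torch.randint(0, 10000, (8, 20))   # toy batch of token ids
tag_gold = torch.randint(0, 17, (8, 20))    # toy per-token tags
cls_gold = torch.randint(0, 2, (8,))        # toy sentence labels
for task, gold in [("tagging", tag_gold), ("classification", cls_gold)]:
    logits = model(tokens, task)
    loss = loss_fn(logits.reshape(-1, logits.size(-1)), gold.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because both losses backpropagate into the same encoder, the tasks regularize each other; soft parameter tying (Duong et al. 2015, referenced above) instead keeps separate per-task parameters and penalizes the distance between them.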