Multi-lingual Learning Models (4/20/2021)
- Multilingual learning
- Multilingual pre-trained models
- Cross-lingual transfer methods
- Active learning
- Reference: Google's Multilingual Neural Machine Translation System: Enabling Zero-Shot Translation (Johnson et al. 2016)
- Reference: Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT (Wu and Dredze 2019)
- Reference: Unsupervised Cross-lingual Representation Learning at Scale (Conneau et al. 2019)
- Reference: Massively Multilingual Neural Machine Translation (Aharoni et al. 2019)
- Reference: Massively Multilingual Neural Machine Translation in the Wild: Findings and Challenges (Arivazhagan et al. 2019)
- Reference: Balancing Training for Multilingual Neural Machine Translation (Wang et al. 2020)
- Reference: Multi-task Learning for Multiple Language Translation (Dong et al. 2015)
- Reference: Multi-Way, Multilingual Neural Machine Translation with a Shared Attention Mechanism (Firat et al. 2016)
- Reference: Parameter Sharing Methods for Multilingual Self-Attentional Translation Models (Sachan and Neubig 2018)
- Reference: Contextual Parameter Generation for Universal Neural Machine Translation (Platanios et al. 2018)
- Reference: MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer (Pfeiffer et al. 2020)
- Reference: Cross-lingual Language Model Pretraining (Lample and Conneau 2019)
- Reference: Unicoder: A Universal Language Encoder by Pre-training with Multiple Cross-lingual Tasks (Huang et al. 2019)
- Reference: Explicit Alignment Objectives (Hu et al. 2020)
- Reference: XTREME (Hu et al. 2020)
- Reference: XGLUE (Liang et al. 2020)
- Reference: XTREME-R (Ruder et al. 2021)
- Reference: Rapid Adaptation of Neural Machine Translation to New Languages (Neubig and Hu 2018)
- Reference: Meta-learning for Low-resource Translation (Gu et al. 2018)
- Reference: How multilingual is Multilingual BERT? (Pires et al. 2019)
- Reference: Inducing Multilingual Text Analysis Tools via Robust Projection across Aligned Corpora (Yarowsky et al. 2001)
- Reference: Choosing Transfer Languages for Cross-Lingual Learning (Lin et al. 2019)
- Reference: Phonological Transfer for Entity Linking (Rijhwani et al. 2019)
- Reference: Handling Syntactic Divergence (Zhou et al. 2019)
- Reference: Support Vector Machine Active Learning with Applications to Text Classification (Tong and Koller 2001)
- Reference: Reducing labeling effort for structured prediction tasks (Culotta and McCallum 2005)
- Reference: Active Learning for Convolutional Neural Networks: A Core-Set Approach (Sener and Savarese 2017)
- Reference: A Little Annotation does a Lot of Good: A Study in Bootstrapping Low-resource Named Entity Recognizers (Chaudhary et al. 2019)
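To make the multilingual translation references above concrete: the core trick in Johnson et al. (2016) is to train a single model on all language pairs, prepending an artificial token that names the desired target language to each source sentence. A minimal sketch (the helper name and token format `<2xx>` are illustrative, not from any paper's released code):

```python
def add_target_token(source_tokens, target_lang):
    """Prepend an artificial target-language token (e.g. <2es> for Spanish),
    so one shared model can be steered toward any target language."""
    return [f"<2{target_lang}>"] + source_tokens

# The same English source, routed to Spanish vs. Japanese:
print(add_target_token(["hello", "world"], "es"))  # → ['<2es>', 'hello', 'world']
print(add_target_token(["hello", "world"], "ja"))  # → ['<2ja>', 'hello', 'world']
```

Because the target language is just another input token, the model can also be asked to translate between pairs never seen together in training, which is what enables the zero-shot behavior the paper reports.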
Slides: Multilingual Learning Slides
Video: Multilingual Learning Video
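As an illustration of the active-learning references listed above (Tong and Koller 2001 popularized uncertainty-based selection), here is a minimal entropy-based uncertainty-sampling sketch in plain Python. All names are illustrative; real systems would score with the task model's predictive distribution:

```python
import math

def entropy(probs):
    """Shannon entropy of a predictive distribution; higher = more uncertain."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_most_uncertain(pool, model_probs, k=1):
    """Pick the k unlabeled examples whose predictions have highest entropy,
    i.e. the ones the model is least sure about and most worth annotating."""
    ranked = sorted(pool, key=lambda ex: entropy(model_probs[ex]), reverse=True)
    return ranked[:k]

# Toy pool: the model is confident about "a", unsure about "b".
probs = {"a": [0.9, 0.05, 0.05], "b": [0.34, 0.33, 0.33]}
print(select_most_uncertain(["a", "b"], probs, k=1))  # → ['b']
```

Variants in the references swap the scoring rule (least-confidence in Culotta and McCallum 2005, core-set coverage in Sener and Savarese 2017) while keeping this same select-then-annotate loop.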