Machine Reading with Neural Nets (4/10/2018)
- TBD
- No quiz
- Reference: MCTest (Richardson et al. 2013)
- Reference: RACE (Lai et al. 2017)
- Reference: SQuAD (Rajpurkar et al. 2016)
- Reference: TriviaQA (Joshi et al. 2017)
- Reference: Teaching Machines to Read and Comprehend (Hermann et al. 2015)
- Reference: Attention Sum (Kadlec et al. 2016)
- Reference: Attention over Attention (Cui et al. 2017)
- Reference: Bidirectional Attention Flow (Seo et al. 2017)
- Reference: Dynamic Coattention Networks (Xiong et al. 2017)
- Reference: Gated Attention Readers (Dhingra et al. 2017)
- Reference: Representation and Inference for Natural Language (Blackburn and Bos 1999)
- Reference: Memory Networks (Weston et al. 2015)
- Reference: End-to-end Memory Networks (Sukhbaatar et al. 2015)
- Reference: Dynamic Memory Networks (Kumar et al. 2016)
- Reference: Learning to Stop Reading (Shen et al. 2017)
- Reference: Coarse-to-fine Question Answering (Choi et al. 2017)
- Reference: Reading Wikipedia to Answer Open-Domain Questions (Chen et al. 2017)
- Reference: End-to-end Differentiable Proving (Rocktäschel and Riedel 2017)
- Reference: bAbI Dataset (Weston et al. 2015)
- Reference: NLP in Prolog (Pereira and Shieber 2002), Example Code
- Reference: A Thorough Examination of the CNN/Daily Mail Task (Chen et al. 2016)
- Reference: Adversarial Examples in SQuAD (Jia and Liang 2017)
Slides: Machine Reading Slides