This page contains some of the slides that I've made for tutorials, etc., generally aimed at Masters or early-PhD students in our lab. If you have any questions, corrections, or comments, I'd be glad to hear them. If you'd like me to present on these (or something similar), send me an email; I enjoy giving tutorials!
- Spring 2021: Neural Networks for NLP (CS11-747 @ CMU)
- Fall 2020: Multilingual NLP (CS11-737 @ CMU)
- Spring 2020: Neural Networks for NLP (CS11-747 @ CMU)
- Fall 2019: Machine Translation and Sequence-to-sequence Models (CS11-731 @ CMU)
- Spring 2019: Neural Networks for NLP (CS11-747 @ CMU)
- Fall 2018: Machine Translation and Sequence-to-sequence Models (CS11-731 @ CMU)
- Spring 2018: Neural Networks for NLP (CS11-747 @ CMU)
- Fall 2017: Neural Networks for NLP (CS11-747 @ CMU)
- Spring 2017: Machine Translation and Sequence-to-sequence Models (CS11-731 @ CMU)
- 2015: Sequential Data Modeling (@NAIST; 2 sessions)
- 2014: Sequential Data Modeling (@NAIST; 2 sessions)
- 2014: Speech Information Processing (@Kansai University; 2 sessions)
- 2013: Advanced Research Seminar: Machine Translation
- 2013: Sequential Data Modeling (@NAIST; Structured Perceptron, Conditional Random Fields)
- 2013: Speech Information Processing (@Kansai University; 2 sessions)
- 2012: Intelligent System Design (@NAIST; Machine Translation)
I've (co-)written a number of textbooks on technical topics (mostly in Japanese at the moment).
Natural Language Processing: Basics and Technology
Iwanami Data Science (Volume 2: Natural Language Processing)
Practical Neural Networks for NLP: From Theory to Code
A tutorial given by Chris Dyer, Yoav Goldberg, and me at EMNLP 2016. It covers how to turn your NLP ideas into code for neural network models, with a focus on our toolkit DyNet.
- Part 1: Neural Nets, Recurrent Nets, and More
- Part 2: More Complicated RNNs, Tree Structured Networks, Structured Prediction
NLP Programming Tutorial
This is a tutorial I teach at NAIST for people starting to learn how to program basic algorithms for natural language processing. You need very little programming experience to start out, but each tutorial builds on the previous ones, so it is highly recommended that you do them in order. You can download the slides and data for the practice exercises from GitHub.
- Tutorial 0: Programming Basics
- Tutorial 1: Unigram Language Models
- Tutorial 2: Bigram Language Models
- Tutorial 3: Word Segmentation
- Tutorial 4: Part-of-Speech Tagging with Hidden Markov Models
- Tutorial 5: The Perceptron Algorithm
- Tutorial 6: Advanced Discriminative Training
- Tutorial 7: Neural Networks
- Tutorial 8: Recurrent Neural Networks
- Tutorial 9: Topic Models
- Tutorial 10: Phrase Structure Parsing
- Tutorial 11: Dependency Parsing
- Tutorial 12: Structured Perceptron
- Tutorial 13: Search Algorithms
- Bonus 1: Kana-Kanji Conversion for Japanese Input
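As a taste of the early tutorials, the unigram language model of Tutorial 1 can be sketched in a few lines of Python. This is a hypothetical minimal example, not the tutorial's reference code; the sentence-end symbol `</s>`, the interpolation weight `lam`, and the assumed vocabulary size are conventions I've chosen here for illustration:

```python
from collections import Counter
import math

def train_unigram(corpus):
    """Maximum-likelihood unigram probabilities from a tokenized corpus."""
    counts = Counter()
    for sent in corpus:
        for w in sent + ["</s>"]:  # count a sentence-end symbol too
            counts[w] += 1
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def log_prob(model, sent, lam=0.95, vocab_size=1_000_000):
    """Log2 probability of a sentence, interpolating with a uniform
    distribution so unknown words get nonzero probability."""
    lp = 0.0
    for w in sent + ["</s>"]:
        p = lam * model.get(w, 0.0) + (1 - lam) / vocab_size
        lp += math.log2(p)
    return lp

corpus = [["a", "b", "a"], ["b", "c"]]
model = train_unigram(corpus)
print(round(model["a"], 4))  # prints 0.2857 (= 2/7)
```

The interpolation with a uniform distribution is the simplest way to handle unknown words; the later tutorials replace tricks like this with properly learned smoothing.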
Tips on Building Neural Translation Systems
I wrote a fairly extensive tutorial on some of the things that you need to do to make a good neural machine translation system (circa July 2016). I plan to update this occasionally when new methods come out, so feel free to "follow" it for updates.
Building a Phrase-Based Machine Translation System
This tutorial (by me and Kevin) covers the many steps involved in building a phrase-based machine translation system, specifically what goes on when training a system with Moses. For each of the many steps, it describes the processing that occurs, open-source tools that can be used to perform that processing, and some of the existing research problems.
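One of the central steps the tutorial covers is extracting phrase pairs that are consistent with a word alignment. The sketch below is a simplified, hypothetical version of the standard extraction algorithm, not Moses's actual implementation; in particular it skips the usual extension to unaligned boundary words, and the `max_len` limit is just an assumed default:

```python
def extract_phrases(src_len, alignment, max_len=4):
    """Extract (s_start, s_end, t_start, t_end) spans of phrase pairs
    consistent with a set of 0-based (src_idx, trg_idx) alignment links."""
    phrases = set()
    for s_start in range(src_len):
        for s_end in range(s_start, min(s_start + max_len, src_len)):
            # Target positions linked to anything in [s_start, s_end]
            t_idxs = [j for (i, j) in alignment if s_start <= i <= s_end]
            if not t_idxs:
                continue
            t_start, t_end = min(t_idxs), max(t_idxs)
            if t_end - t_start >= max_len:  # target side too long
                continue
            # Consistency: no link from inside the target span
            # to a source word outside the source span
            if any(t_start <= j <= t_end and not (s_start <= i <= s_end)
                   for (i, j) in alignment):
                continue
            phrases.add((s_start, s_end, t_start, t_end))
    return phrases

pairs = extract_phrases(2, {(0, 0), (1, 1)})
print(sorted(pairs))  # [(0, 0, 0, 0), (0, 1, 0, 1), (1, 1, 1, 1)]
```

Even this toy version shows the key property: a diagonal two-word alignment yields both single-word pairs and the combined two-word pair, all of which would then be scored and stored in the phrase table.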
Bayesian Non-Parametrics Tutorial
This is a simple tutorial on non-parametric Bayesian techniques, in several parts. The first part discusses the basic motivation and theory behind Bayesian non-parametrics. The second part demonstrates how to implement unsupervised part-of-speech induction using Gibbs sampling and a Bayesian HMM, followed by an explanation of how to go from the finite HMM to the infinite HMM. Finally, the tutorial covers some more advanced topics and applications from the recent literature.
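To give a flavor of the second part, here is a hypothetical, simplified Gibbs sampler for POS induction with a finite Bayesian HMM (symmetric Dirichlet priors `alpha` on transitions and `beta` on emissions). It is a sketch under my own assumptions, not the tutorial's code, and it ignores the small count-interaction corrections an exact collapsed sampler would apply when adjacent tags coincide:

```python
import random
from collections import defaultdict

def gibbs_pos(sents, num_tags=5, alpha=0.1, beta=0.1, iters=50, seed=0):
    """Resample each word's tag from its conditional distribution given
    all other tags, using counts with Dirichlet smoothing."""
    rng = random.Random(seed)
    V = len({w for s in sents for w in s})
    tags = [[rng.randrange(num_tags) for _ in s] for s in sents]
    trans = defaultdict(int)    # (prev_tag, tag) counts; -1 = BOS/EOS
    emit = defaultdict(int)     # (tag, word) counts
    tag_tot = defaultdict(int)  # emission totals per tag
    ctx_tot = defaultdict(int)  # outgoing-transition totals per context

    def add(s_idx, i, d):
        # Add (d=+1) or remove (d=-1) the counts touching position i
        s, t = sents[s_idx], tags[s_idx]
        prev = t[i - 1] if i > 0 else -1
        nxt = t[i + 1] if i < len(t) - 1 else -1
        trans[(prev, t[i])] += d; ctx_tot[prev] += d
        trans[(t[i], nxt)] += d; ctx_tot[t[i]] += d
        emit[(t[i], s[i])] += d; tag_tot[t[i]] += d

    for s_idx in range(len(sents)):
        for i in range(len(sents[s_idx])):
            add(s_idx, i, +1)

    for _ in range(iters):
        for s_idx, s in enumerate(sents):
            for i, w in enumerate(s):
                add(s_idx, i, -1)  # remove this position's counts
                prev = tags[s_idx][i - 1] if i > 0 else -1
                nxt = tags[s_idx][i + 1] if i < len(s) - 1 else -1
                probs = []
                for t in range(num_tags):
                    p = (trans[(prev, t)] + alpha) / (ctx_tot[prev] + alpha * num_tags)
                    p *= (trans[(t, nxt)] + alpha) / (ctx_tot[t] + alpha * num_tags)
                    p *= (emit[(t, w)] + beta) / (tag_tot[t] + beta * V)
                    probs.append(p)
                r = rng.random() * sum(probs)
                new_t = num_tags - 1
                for t, p in enumerate(probs):
                    r -= p
                    if r <= 0:
                        new_t = t
                        break
                tags[s_idx][i] = new_t
                add(s_idx, i, +1)  # restore counts with the new tag
    return tags
```

The move from this finite sampler to the infinite HMM, where `num_tags` is itself inferred rather than fixed, is exactly the step the tutorial's later sections explain.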