Graham Neubig's Teaching

Classes

Textbooks

I've (co-)written a number of textbooks on technical topics (mostly in Japanese at the moment).

Machine Translation
Corona-sha, 2014
Natural Language Processing: Basics and Technology
Shoeisha, 2016
Iwanami Data Science (Volume 2: Natural Language Processing)
Iwanami, 2016

Older Materials

Practical Neural Networks for NLP: From Theory to Code

A tutorial given by Chris Dyer, Yoav Goldberg, and me at EMNLP 2016. It covers how to turn your NLP ideas into code for neural network models, with a focus on our toolkit DyNet.
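
To give a flavor of the style of code the tutorial teaches, here is a minimal sketch of training a tiny network on a toy problem with DyNet's Python API. The network shape, data, and training settings are purely illustrative, and exact call names may vary across DyNet versions.

    import dynet as dy

    # A minimal sketch: train a tiny multilayer perceptron on XOR with DyNet.
    # The network shape, data, and training settings are purely illustrative.
    model = dy.ParameterCollection()
    trainer = dy.SimpleSGDTrainer(model)
    pW = model.add_parameters((8, 2))   # hidden layer weights
    pb = model.add_parameters(8)        # hidden layer bias
    pV = model.add_parameters((1, 8))   # output weights
    pa = model.add_parameters(1)        # output bias

    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
    for epoch in range(200):
        for x, y in data:
            dy.renew_cg()                           # build a new graph per example
            W, b = dy.parameter(pW), dy.parameter(pb)
            V, a = dy.parameter(pV), dy.parameter(pa)
            h = dy.tanh(W * dy.inputVector(x) + b)  # hidden layer
            y_hat = dy.logistic(V * h + a)          # predicted probability of 1
            loss = dy.binary_log_loss(y_hat, dy.scalarInput(y))
            loss.value()                            # forward pass
            loss.backward()                         # backprop through the graph
            trainer.update()                        # SGD update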

NLP Programming Tutorial

This is a tutorial I used to do at NAIST for people starting to learn how to program basic algorithms for natural language processing. You need very little programming experience to get started, but each tutorial builds on the material from the previous ones, so it is highly recommended that you do them in order. You can download the slides and data for the practice exercises from GitHub.
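
To give a sense of the level the exercises start at, here is a minimal sketch of the kind of word-frequency counter the early exercises build on (the command-line file argument is just a placeholder).

    import sys
    from collections import defaultdict

    # Count word frequencies in a whitespace-tokenized text file and print
    # them from most to least frequent.  Usage: python count_words.py FILE
    counts = defaultdict(int)
    with open(sys.argv[1], encoding="utf-8") as f:
        for line in f:
            for word in line.strip().split():
                counts[word] += 1

    for word, count in sorted(counts.items(), key=lambda x: -x[1]):
        print(f"{word}\t{count}")
    print(f"{len(counts)} unique word types", file=sys.stderr)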

Machine Translation

Note that the following are now (as of August 2022) very out of date; I'm keeping them here for historical purposes.

Tips on Building Neural Translation Systems

I wrote a fairly extensive tutorial on some of the things that you need to do to make a good neural machine translation system (circa July 2016).

Building a Phrase-Based Machine Translation System

This tutorial (by me and Kevin) covers the many steps involved in building a phrase-based machine translation system, specifically what goes on when training a system using Moses. For each of the many steps, it describes the processing that occurs, open-source tools that can be used to do this processing, and some of the existing research problems.

Slides: [PDF] [ODP]
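
As an illustration, one core step in training any phrase-based system is extracting the phrase pairs that are consistent with a word alignment. The sketch below shows a simplified version of that idea; it omits the expansion over unaligned boundary words that a full extractor such as the one in Moses performs, and the names and limits are just for illustration.

    def extract_phrases(src, tgt, alignment, max_len=7):
        """Extract phrase pairs consistent with a word alignment.

        src, tgt: lists of tokens; alignment: set of (i, j) pairs meaning
        source word i is aligned to target word j (0-based indices).
        """
        phrases = []
        for i1 in range(len(src)):
            for i2 in range(i1, min(i1 + max_len, len(src))):
                # target positions linked to anything in the source span
                linked = [j for (i, j) in alignment if i1 <= i <= i2]
                if not linked:
                    continue
                j1, j2 = min(linked), max(linked)
                if j2 - j1 + 1 > max_len:
                    continue
                # consistency: nothing inside the target span may link outside the source span
                if any(j1 <= j <= j2 and not i1 <= i <= i2 for (i, j) in alignment):
                    continue
                phrases.append((" ".join(src[i1:i2 + 1]), " ".join(tgt[j1:j2 + 1])))
        return phrases

    # Toy example: "watashi wa gakusei desu" / "i am a student"
    src = "watashi wa gakusei desu".split()
    tgt = "i am a student".split()
    alignment = {(0, 0), (3, 1), (2, 3)}
    print(extract_phrases(src, tgt, alignment))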

Bayesian Non-Parametrics Tutorial

This is a simple tutorial about non-parametric Bayesian techniques, consisting of several parts. The first part discusses the basic motivations and theory behind the use of Bayesian non-parametrics. The second part demonstrates how to implement unsupervised part-of-speech induction using Gibbs sampling and the Bayesian HMM, followed by an explanation of how to go from the finite HMM to the infinite HMM. Finally, the tutorial covers some more advanced topics and applications proposed in the recent literature.

Slides: [PDF] [ODP]
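
As a rough illustration of the kind of model discussed in the second part, here is a much-simplified sketch of a collapsed Gibbs sampler for a Dirichlet-smoothed HMM tagger. It uses a fixed, finite tag set (no infinite HMM), fixed hyperparameters, and omits the correction terms needed when adjacent tags coincide, so it is a sketch of the idea rather than a faithful implementation of the tutorial's models.

    import random
    from collections import defaultdict

    # A toy collapsed Gibbs sampler for unsupervised POS induction with a
    # Bayesian (Dirichlet-smoothed) HMM.  Simplifications: finite tag set,
    # fixed hyperparameters, no corrections for adjacent identical tags.
    ALPHA, BETA, NUM_TAGS, BOS = 0.1, 0.1, 5, -1

    def induce_tags(corpus, vocab_size, iters=50):
        """corpus: a list of sentences, each a list of integer word ids."""
        tags = [[random.randrange(NUM_TAGS) for _ in sent] for sent in corpus]
        trans, ctx, emit, tag_tot = (defaultdict(int) for _ in range(4))

        # initialize counts from the random tag assignment
        for sent, ts in zip(corpus, tags):
            prev = BOS
            for w, t in zip(sent, ts):
                trans[(prev, t)] += 1; ctx[prev] += 1
                emit[(t, w)] += 1; tag_tot[t] += 1
                prev = t
            trans[(prev, BOS)] += 1; ctx[prev] += 1

        def adjust(prev, t, nxt, w, d):
            # add (d=+1) or remove (d=-1) all counts involving one position
            trans[(prev, t)] += d; ctx[prev] += d
            trans[(t, nxt)] += d; ctx[t] += d
            emit[(t, w)] += d; tag_tot[t] += d

        for _ in range(iters):
            for sent, ts in zip(corpus, tags):
                for k, w in enumerate(sent):
                    prev = ts[k - 1] if k > 0 else BOS
                    nxt = ts[k + 1] if k + 1 < len(sent) else BOS
                    adjust(prev, ts[k], nxt, w, -1)      # take this position out
                    probs = []
                    for c in range(NUM_TAGS):            # score each candidate tag
                        # +1 in the normalizer allows the sentence-boundary outcome
                        p = (trans[(prev, c)] + ALPHA) / (ctx[prev] + ALPHA * (NUM_TAGS + 1))
                        p *= (trans[(c, nxt)] + ALPHA) / (ctx[c] + ALPHA * (NUM_TAGS + 1))
                        p *= (emit[(c, w)] + BETA) / (tag_tot[c] + BETA * vocab_size)
                        probs.append(p)
                    ts[k] = random.choices(range(NUM_TAGS), weights=probs)[0]
                    adjust(prev, ts[k], nxt, w, +1)      # put it back with the new tag
        return tags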

Other Presentations and Useful Information