Advanced Methods in Natural Language Processing – Spring 2017

When: Tue, 13-16
Where: Dan David 203
Lecturer: Jonathan Berant
Grader: Dor Muhlgay dormuhlg at mail dot tau etc
Office hours: Coordinate by e-mail
Forum: Moodle


Natural Language Processing (NLP) aims to develop methods for processing, analyzing, and understanding natural language. The goal of this class is to provide a thorough overview of modern methods in the field. The class does not assume prior knowledge of NLP and will focus mostly on methods from structured prediction and deep learning.


Machine learning is a prerequisite for this class. If you want to attend but have not taken a machine learning class (or something equivalent), you may join only with my approval (update: since the class is full, this is becoming less likely). Proficiency in programming is required. Some assignments will require Python; in others you may use whatever programming language you feel comfortable with.

  1. Homework assignments: There will be 4 or 5 homework assignments, which together constitute 50% of the final grade. Assignments should be submitted in pairs according to the instructions on each assignment. You get 5 late days for the whole semester; beyond that, the penalty is 5 points per day per assignment.
  2. Project: A final project will constitute the other 50% of the final grade. In your final project you can either implement an algorithm/model/system from a paper or define and attack your own research problem. Projects will be done in groups of three and will be presented in the last class (10 min. per group). The project proposal is due on 20.5.2017. You will present your progress so far and your plans in one of the last two classes. The final project is due on September 10th and will include a detailed "paper-like" report of what you have done, along with your code. It is possible (and recommended, where feasible) for a group to submit one project for both this class and the advanced ML class; if you do that, your final project needs to be correspondingly bigger.
Recommended reading
Very tentative schedule

Date | Topic                                              | Reading                                                                                                           | Comments
-----|----------------------------------------------------|-------------------------------------------------------------------------------------------------------------------|------------------------------------
14/3 | Introduction; Word embeddings                      | word2vec, GloVe, Embeddings as matrix factorization                                                               |
21/3 | Language models                                    | Michael Collins' lecture notes                                                                                    | Assignment 1 out (due: 4/4/2017)
28/3 | Language models: neural LMs, FFNNs, RNNs; Computation graphs | Neural embeddings, Backpropagation, Training RNNs                                                       |
4/4  | Tagging; Log-linear models                         | Michael Collins' HMM notes, Michael Collins' LLM notes, MEMMs, FAQ                                                | Assignment 2 out
5/4  | Global linear models                               | Michael Collins' lecture notes, CRFs and label bias                                                               |
25/4 | Syntax, grammars; Lexicalized PCFGs                | PCFG lecture notes, Lexicalized PCFG lecture notes                                                                | Assignment 3 out
9/5  | Discriminative models for parsing; Shift-reduce parsing | Ratnaparkhi, 97; Hall et al., 14; Neural CRF parsing                                                         |
16/5 | Semantic parsing                                   | Compositionality (Liang and Potts), Artzi et al. tutorial                                                         |
23/5 | Semantic parsing                                   | Clarke et al., 2010; Liang et al., 2011; Artzi and Zettlemoyer; Berant et al., 2013; Berant and Liang, 2015       | Assignment 4 out
6/6  | Sequence to sequence                               | LSTM, seq2seq, GRU, Attention, Pointer networks, Jia and Liang, 2016, Weak supervision, Guu et al., 2017          |
13/6 | RL, Tree-RNNs                                      | CVG, CCG with guarantees                                                                                          |
20/6 | Projects                                           |                                                                                                                   |
27/6 | Projects                                           |                                                                                                                   |