When: Tue, 13-16
Where: Orenstein 103
Instructor: Jonathan Berant
Graders: Ben Bogin (benb969), Mor Geva (mega.mor), Omri Koshorek (ko.omri), all at gmail.
Office hours: Coordinate by e-mail
Natural Language Processing (NLP) aims to develop methods for processing, analyzing and understanding natural language. The goal of this class is to provide a thorough overview of modern methods in the field of Natural Language Processing. The class will not assume prior knowledge in NLP, and will mostly focus on methods from structured prediction and deep learning.
Machine learning is a prerequisite for this class. If you want to attend but have not taken a machine learning class or an equivalent course (MOOCs do not count), please talk to the instructor. Some assignments will require writing code in Python; in others you are free to choose any programming language.
| Date | Topic | Materials | Assignment |
|------|-------|-----------|------------|
| 13/3 | Word embeddings; neural LMs, FFNNs | Embeddings as matrix factorization; Michael Collins' lecture notes, Neural embeddings, Backpropagation | Assign. 1 |
| | Recurrent language models: RNNs, LSTMs, GRUs | Training RNNs; LSTMs; more on LSTMs and GRUs | Assign. |
| | | Michael Collins' lecture notes, Michael Collins' HMM notes, Michael Collins' LLM notes, MEMMs, FAQ | |
| 24/4 | Globally-normalized linear models; Deep learning for tagging | CRFs and label bias; Globally vs. locally normalized models; BiLSTM CRF for tagging | |
| 1/5 | Introduction to parsing | PCFG lecture notes | |
| | | Lexicalized PCFG lecture notes; Ratnaparkhi, 97; Hall et al., 14 | |
| | Deep syntactic parsing; Semantic parsing intro | Neural CRF parsing, Minimal span-based neural; Clarke et al., 2010, Liang et al., 2011, Artzi and Zettlemoyer, Berant et al., 2013, Berant and Liang, 2015 | Assign. |
| 29/5 | Sequence to sequence | seq2seq, Attention, Pointer networks, Jia and Liang, 2016, Weak supervision, Guu et al., 2017 | |
| 5/6 | Weakly-supervised sequence to sequence models | | |
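The schedule above lists topics rather than explaining them, but for students gauging whether their Python background suffices for the assignments, here is a small self-contained sketch of one primitive that recurs throughout the word-embedding material: cosine similarity between embedding vectors. The vectors below are made up purely for illustration and are not from any course dataset:

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: dot(u, v) / (|u| * |v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 4-dimensional "embeddings" (invented for this example).
king = [0.8, 0.1, 0.7, 0.2]
queen = [0.7, 0.2, 0.8, 0.3]
apple = [0.1, 0.9, 0.1, 0.8]

# Vectors of related words should score higher than unrelated ones.
print(cosine_similarity(king, queen))  # relatively high
print(cosine_similarity(king, apple))  # relatively low
```

Real embedding code in the assignments would typically use NumPy rather than plain lists, but the underlying computation is the same.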