When: Tuesdays, 13:00-16:00
Where: Orenstein 103
Instructor: Jonathan Berant
Graders: Ben Bogin (benb969), Mor Geva (mega.mor), Omri Koshorek (ko.omri), all at gmail.com
Office hours: Coordinate by e-mail
Natural Language Processing (NLP) aims to develop methods for processing, analyzing, and understanding natural language. The goal of this class is to provide a thorough overview of modern methods in the field. The class assumes no prior knowledge of NLP and will focus mostly on methods from structured prediction and deep learning.
Machine learning is a prerequisite for this class. If you want to attend but have not taken a machine learning class or an equivalent (MOOCs do not count), please talk to the instructor. Some assignments will require writing code in Python; in others you are free to choose any programming language.
| Date | Topic | Readings | Assignment |
|------|-------|----------|------------|
| 13/3 | Word embeddings | Embeddings as matrix factorization | Assign. 1 |
|      | Neural LMs, FFNNs, RNNs | Michael Collins' lecture notes, Neural embeddings, Backpropagation, Training RNNs | |
|      |       | Michael Collins' HMM notes, Michael Collins' LLM notes, MEMMs, FAQ | Assign. 2 |
| 17/4 | Global linear models | Michael Collins' lecture notes, CRFs and label bias | |
|      |       | PCFG lecture notes, Lexicalized PCFG lecture notes | Assign. 3 |
| 1/5  | Discriminative models for parsing | Ratnaparkhi, 97; Hall et al., 14; Neural CRF parsing | |
| 8/5  | Semantic parsing | Compositionality (Liang and Potts), Artzi et al. tutorial | Assign. 4 |
|      |       | Clarke et al., 2010; Liang et al., 2011; Artzi and Zettlemoyer; Berant et al., 2013; Berant and Liang, 2015 | |
| 22/5 | Sequence to sequence | LSTM, seq2seq, GRU, Attention, Pointer networks, Jia and Liang, 2016, Weak supervision, Guu et al., 2017 | Assign. 5 |
|      |       | CVG, CCG with guarantees | |