Natural Language Processing (NLP) aims to develop methods for processing, analyzing, and understanding natural language. The goal of this class is to provide a thorough overview of modern methods in the field. The class assumes no prior knowledge of NLP and focuses mostly on methods from structured prediction and deep learning.
Machine learning is a prerequisite for this class. If you want to attend but have not taken a machine learning class or something equivalent, you may join the class only with my approval (update: since the class is full, this is becoming less likely). Proficiency in programming is required. Some assignments will require Python, while in others you can choose whatever programming language you feel comfortable with.
| Date | Topic | Readings | Assignments |
|------|-------|----------|-------------|
| | | word2vec, GloVe, Embeddings as matrix factorization | |
| | | Michael Collins' lecture notes | Assignment 1 out |
| | neural LMs, FFNNs, RNNs | Neural embeddings, Backpropagation, Training RNNs | |
| | | Michael Collins' HMM notes, Michael Collins' LLM notes, MEMMs, FAQ | Assignment 2 out |
| 5/4 | Global linear models | Michael Collins' lecture notes, CRFs and label bias | |
| | | PCFG lecture notes, Lexicalized PCFG lecture notes | Assignment 3 out |
| 9/5 | Discriminative models for parsing | Ratnaparkhi, 1997; Hall et al., 2014; Neural CRF parsing | |
| 16/5 | Semantic parsing | Compositionality (Liang and Potts), Artzi et al. tutorial | |
| | | Clarke et al., 2010; Liang et al., 2011; Artzi and Zettlemoyer; Berant et al., 2013; Berant and Liang, 2015 | Assignment 4 out |
| 6/6 | Sequence to sequence | LSTM, seq2seq, GRU, Attention, Pointer networks, Jia and Liang, 2016; Weak supervision, Guu et al., 2017 | |
| | | CVG, CCG with guarantees | |
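To give a flavor of the structured-prediction methods covered above (e.g., the HMM material in Collins' notes), here is a minimal sketch of Viterbi decoding for a toy HMM. The states, vocabulary, and probabilities below are invented for illustration and are not taken from the course materials.

```python
# Viterbi decoding for a toy HMM: find the most likely hidden state
# sequence for an observation sequence. All numbers here are made up.

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely state sequence for `obs`."""
    # V[t][s] = best score of any state path ending in state s at time t
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]  # back[t][s] = best predecessor of s at time t
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            score, prev = max(
                (V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states
            )
            V[t][s] = score
            back[t][s] = prev
    # Trace the best path backwards from the highest-scoring final state
    best = max(states, key=lambda s: V[-1][s])
    path = [best]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

# Toy two-tag POS example (hypothetical probabilities)
states = ["N", "V"]
start_p = {"N": 0.6, "V": 0.4}
trans_p = {"N": {"N": 0.3, "V": 0.7}, "V": {"N": 0.8, "V": 0.2}}
emit_p = {"N": {"fish": 0.6, "swim": 0.4}, "V": {"fish": 0.3, "swim": 0.7}}

print(viterbi(["fish", "swim"], states, start_p, trans_p, emit_p))
# → ['N', 'V']
```

The dynamic program runs in O(T·|S|²) time; global linear models and CRFs (covered later in the schedule) reuse the same decoding idea with different scoring functions.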