Machine Learning: Foundations (2010/11)

Tentative Class schedule:

1.       Introduction 

2.       Bayesian Inference

3.       PAC model and Occam's Razor

4.       Online Learning: Mistake Bound, Winnow, Perceptron                 [pptx]

5.       Regret Minimization

6.       Boosting and Margin

7.       VC dimension I  - definition and impossibility result                          [ppt]

8.       VC dimension II - sample bound (Rademacher complexity)

9.       Convex Programming and Support Vector Machine                         [Andrew Ng class notes] [pptx]

10.   Kernels, SVM and SMO algorithm

11.   Model Selection

12.   Decision Trees

13.   Fourier transform of Boolean functions                                                 [survey]
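As a taste of the online-learning material in lecture 4, here is a minimal sketch of the Perceptron mistake-driven update on a toy linearly separable dataset (illustrative code only, not course-provided material; the data and function name are made up for this example):

```python
import numpy as np

def perceptron(X, y, epochs=10):
    # X: (n, d) feature matrix; y: labels in {-1, +1}.
    # Start with the zero weight vector and update only on mistakes.
    w = np.zeros(X.shape[1])
    mistakes = 0
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            if y_i * np.dot(w, x_i) <= 0:  # prediction has wrong sign (or is 0)
                w += y_i * x_i             # Perceptron update rule
                mistakes += 1
    return w, mistakes

# Toy data: the label is the sign of the first coordinate.
X = np.array([[2.0, 1.0], [1.5, -0.5], [-1.0, 0.5], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w, mistakes = perceptron(X, y)
print(w, mistakes)  # converges after a single mistake on this data
```

The mistake-bound analysis covered in class bounds the total number of such updates by (R/γ)² for data of radius R with margin γ, independent of the number of rounds.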

Homework

Submission guidelines

Homework 1  [note that the programming can also be done in Matlab]  comments

Homework 2

Homework 3

Homework 4


Data Sets

Iris

MNIST

ISOLET


Scribe notes: Each student will write scribe notes for one lecture (template [pdf, tex]; an explanation of LaTeX [pdf, tex])

Scribe list

FINAL PROJECT

Courses on Machine Learning Elsewhere:

·         Introduction to Machine Learning – Shai Shalev-Shwartz (HUJI)

·         Machine Learning Theory – Maria Florina Balcan (Georgia Tech)

·         Machine Learning Theory – Avrim Blum (CMU)

·         Statistical Learning Theory – Peter Bartlett (UC Berkeley)

·         Machine Learning – Andrew Ng (Stanford)

·         Machine Learning – Tommi Jaakkola and Michael Collins (MIT)

·         Foundations of Machine Learning – Mehryar Mohri (NYU)