The power and sources of randomness in computation.
Connections and applications to computational complexity,
computational learning theory, cryptography and combinatorics.
Topics include:
(1) Basic tools: polynomial zero testing, uniform generation and
approximate counting, Fourier analysis, testing properties of
boolean functions, the influence of variables, random walks.
(2) Randomness in various computational settings: computational learning theory,
communication complexity, sublinear time and space computation,
probabilistic proofs, complexity theory.
(3) Generating randomness: pseudorandom generators, derandomization,
graph expansion, extractors.
Grading
Grades will be based on solutions to the homework exercises
and on class participation. Undergraduates will also take a final exam.
Prerequisites
Algorithms. Exposure to computational complexity is helpful but not
required.
Time and location
The course will take place on Wednesdays, 11:00-14:00, in
room 209 of the Dan David building.
Course Information Handout
More details on syllabus, grading, etc. (pdf)
(see also "useful pointers" below)
Announcements
I need 8 volunteers to sign up on Google Docs for grading Homework 4;
click on
"Homework 1 grading"
(it is the same document that we used for Homeworks 1 and 2, hence
the name).
As in Homework 2, we will double up on the grading.
(March 13)
Randomized algorithms. Pairwise independence. Constructions
of pairwise independent sample spaces. Derandomization
via pairwise independence. On the power of two-point sampling.
Avi's talk on
seeded extractors.
(latex source)
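To make the construction concrete, here is a minimal Python sketch of the classic a + i*b mod p sample space and of two-point-style amplification. This is my own illustration, not taken from the lecture notes; the algorithm A, its interface, and its error-probability assumption are hypothetical stand-ins.

import random

def pairwise_independent_samples(p, a, b, m):
    # With p prime and a, b uniform in Z_p, the values x_i = a + i*b mod p
    # (for i = 0, ..., m-1 with m <= p) are each uniform over Z_p and
    # pairwise independent.
    return [(a + i * b) % p for i in range(m)]

def amplify(A, x, p, m):
    # Assumes A(x, r) is a randomized algorithm with error probability <= 1/4
    # over a uniform seed r in Z_p (a hypothetical stand-in). Running A on m
    # pairwise independent seeds and taking a majority vote drives the error
    # down to O(1/m) by Chebyshev, while spending only two random elements of Z_p.
    a, b = random.randrange(p), random.randrange(p)
    answers = [A(x, r) for r in pairwise_independent_samples(p, a, b, m)]
    return max(set(answers), key=answers.count)  # majority vote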
(March 18)
Interactive proofs, graph nonisomorphism, public vs. private coins
(using pairwise independent hashing). Begin uniform generation
of combinatorial objects: satisfying assignments of a DNF formula.
(latex source)
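As a toy illustration of the private-coin protocol for graph nonisomorphism (my own sketch, with a brute-force prover that is feasible only for tiny graphs): the verifier flips a private coin b, sends a randomly relabeled copy of G_b, and accepts iff the prover identifies b.

import random
from itertools import permutations

def permute(edges, pi):
    return frozenset(tuple(sorted((pi[u], pi[v]))) for u, v in edges)

def isomorphic(e1, e2, n):
    # Brute force over all relabelings; fine only for tiny graphs.
    return any(permute(e1, pi) == e2 for pi in permutations(range(n)))

def protocol_round(G0, G1, n):
    b = random.randrange(2)                   # verifier's private coin
    pi = list(range(n)); random.shuffle(pi)   # random relabeling
    H = permute((G0, G1)[b], pi)              # the challenge graph
    guess = 0 if isomorphic(G0, H, n) else 1  # the (all-powerful) prover
    return guess == b                         # verifier accepts iff correct

# If G0 and G1 are nonisomorphic, the prover always identifies b; if they
# are isomorphic, H is distributed identically for b = 0 and b = 1, so any
# prover is correct with probability exactly 1/2.
G0 = {(0, 1), (1, 2), (2, 3), (0, 3)}   # the 4-cycle
G1 = {(0, 1), (0, 2), (0, 3)}           # a star on 4 vertices
print(all(protocol_round(G0, G1, 4) for _ in range(20)))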
(March 25)
Uniformly generating DNF satisfying assignments.
Counting and approximate counting problems. Uniform generation vs.
approximate counting.
(latex source)
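For concreteness, here is a compressed Python sketch of a Karp-Luby-style estimator for the number of DNF satisfying assignments (my own rendering; the clause encoding and sample count are arbitrary choices, and the real analysis sets the number of samples to get an (epsilon, delta) guarantee).

import random

def karp_luby(clauses, n, samples=10000):
    # clauses: list of clauses, each a dict {variable index: required bit}.
    # Estimates the number of assignments in {0,1}^n satisfying the DNF.
    sizes = [2 ** (n - len(c)) for c in clauses]  # satisfying count per clause
    total = sum(sizes)
    hits = 0
    for _ in range(samples):
        # Sample a uniform pair (i, x) with x satisfying clause i.
        i = random.choices(range(len(clauses)), weights=sizes)[0]
        x = [random.randrange(2) for _ in range(n)]
        for v, bit in clauses[i].items():
            x[v] = bit
        # Count the sample iff i is the FIRST clause that x satisfies, so
        # each satisfying assignment is counted via exactly one pair (i, x).
        first = next(j for j, c in enumerate(clauses)
                     if all(x[v] == bit for v, bit in c.items()))
        hits += (first == i)
    return total * hits / samples

# Example: (x0 AND x1) OR (NOT x2) over n = 3 variables has 5 satisfying
# assignments; the estimate should be close to 5.
print(karp_luby([{0: 1, 1: 1}, {2: 0}], n=3))

The same sampler also yields uniform generation: a pair (i, x), kept only when i is the first clause that x satisfies, gives a uniform satisfying assignment.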
(April 22)
Markov chains. Random walks on graphs.
Stationary distributions. Cover time.
Undirected STCONN (USTCON) is in RL.
(latex source)
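A toy version of the random-walk algorithm behind this result (my sketch; the step bound is the usual O(n^3) cover-time bound, with an arbitrary constant):

import random

def ustcon(adj, s, t, steps=None):
    # Walk from s for ~n^3 steps (the cover time of a connected n-vertex
    # graph is O(n^3)); accept iff t is visited. One-sided error: if s and
    # t are connected, this fails with small probability; if not, it never
    # accepts. The real algorithm runs in logarithmic space, which this
    # Python rendering does not attempt to model.
    n = len(adj)
    steps = steps if steps is not None else 4 * n ** 3
    v = s
    for _ in range(steps):
        if v == t:
            return True
        if adj[v]:
            v = random.choice(adj[v])
    return v == t

adj = [[1], [0, 2], [1, 3], [2]]   # the path 0 - 1 - 2 - 3
print(ustcon(adj, 0, 3))           # True with high probability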
(May 6)
Linear Algebra review. Mixing times.
Mixing and saving random bits.
Begin mixing and uniform generation of matchings
of a graph.
(latex source)
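As a small numerical companion (my own example, not from the notes): the mixing time of a reversible chain is governed by the spectral gap 1 - lambda_2 of its transition matrix, which one can see directly for the lazy walk on a cycle.

import numpy as np

def lazy_cycle_walk(n):
    # Transition matrix of the lazy random walk on the n-cycle.
    P = np.zeros((n, n))
    for i in range(n):
        P[i, i] = 0.5
        P[i, (i - 1) % n] += 0.25
        P[i, (i + 1) % n] += 0.25
    return P

n = 64
P = lazy_cycle_walk(n)
lam2 = np.sort(np.linalg.eigvalsh(P))[-2]   # P is symmetric here
gap = 1 - lam2
print("spectral gap:", gap, "=> mixing time roughly log(n)/gap =", np.log(n) / gap)

# Empirical check: total variation distance to uniform from a point mass
# decays at a rate controlled by lambda_2.
dist = np.zeros(n); dist[0] = 1.0
for t in range(1, 20001):
    dist = dist @ P
    if 0.5 * np.abs(dist - 1.0 / n).sum() < 0.25:
        print("TV distance < 1/4 after", t, "steps")
        break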
(May 27)
Computational learning theory. Uniform
distribution learning.
The low-degree algorithm and its application to
learning AC0 functions.
(latex source)
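Here is a minimal rendering of the low-degree algorithm itself (my sketch; the sample sizes and the example target function are arbitrary): estimate all Fourier coefficients of degree at most d from uniform random examples, then predict with the sign of the truncated expansion.

import itertools, random

def chi(S, x):
    # The character chi_S(x) = (-1)^{sum of x_i over i in S}.
    return 1 - 2 * (sum(x[i] for i in S) % 2)

def low_degree_learn(examples, n, d):
    # examples: list of (x, f(x)) pairs, x in {0,1}^n, f(x) in {-1,+1},
    # drawn from the uniform distribution.
    coeffs = {}
    for k in range(d + 1):
        for S in itertools.combinations(range(n), k):
            coeffs[S] = sum(y * chi(S, x) for x, y in examples) / len(examples)
    return lambda x: 1 if sum(c * chi(S, x) for S, c in coeffs.items()) >= 0 else -1

# Example: learn f = chi_{0,1}, a degree-2 parity, on n = 6 variables.
n, d, m = 6, 2, 2000
f = lambda x: chi((0, 1), x)
examples = [(x, f(x)) for x in
            (tuple(random.randrange(2) for _ in range(n)) for _ in range(m))]
h = low_degree_learn(examples, n, d)
tests = [tuple(random.randrange(2) for _ in range(n)) for _ in range(500)]
print(sum(h(x) == f(x) for x in tests) / len(tests))   # should be near 1.0

For AC0, one takes d polylogarithmic in n/epsilon, using the Linial-Mansour-Nisan bound on the Fourier tails of AC0 functions.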
(June 3)
Learning parity with noise: finding the large
Fourier coefficients of a function given query access to it.
Applications to learning functions with bounded
L1 norm and to learning decision trees.
(latex source)
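A compressed sketch of the Kushilevitz-Mansour-style search (all code and names here are mine; the real algorithm fixes sample sizes so that each estimate is reliable): the total weight of the "bucket" of Fourier coefficients sharing a prefix can be estimated from queries, and Parseval keeps the number of live prefixes small.

import random

def chi(alpha, x):
    # chi_alpha(x) = (-1)^{<alpha, x>} for 0/1 vectors alpha, x.
    return 1 - 2 * (sum(a & b for a, b in zip(alpha, x)) % 2)

def bucket_weight(f, n, prefix, samples=2000):
    # Estimates sum over suffixes beta of fhat(prefix . beta)^2 via
    # E_{x, x', y}[ f(x y) chi_prefix(x) * f(x' y) chi_prefix(x') ].
    k = len(prefix)
    total = 0.0
    for _ in range(samples):
        y  = [random.randrange(2) for _ in range(n - k)]
        x1 = [random.randrange(2) for _ in range(k)]
        x2 = [random.randrange(2) for _ in range(k)]
        total += f(x1 + y) * chi(prefix, x1) * f(x2 + y) * chi(prefix, x2)
    return total / samples

def large_coefficients(f, n, theta):
    # Returns (as 0/1 vectors) every S whose estimated |fhat(S)| >= theta.
    # By Parseval, at most about 2/theta^2 prefixes survive each level.
    live = [[]]
    for _ in range(n):
        live = [p + [b] for p in live for b in (0, 1)
                if bucket_weight(f, n, p + [b]) >= theta ** 2 / 2]
    return live

# Example: f is the parity chi_S with S = {0, 2} on n = 4 variables.
f = lambda x: 1 - 2 * ((x[0] + x[2]) % 2)
print(large_coefficients(f, 4, theta=0.5))   # expect [[1, 0, 1, 0]]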
For future scribes, here is a
sample file and
the preamble.tex file that
it uses.
Please get a first draft to me by the Sunday following the
lecture.
See
here
for a few tips.
No class on April 1. Instead, we will have
a makeup class on Friday, March 13, 9:00am-12:00pm, in Schreiber Room 7.
Note that the last hour of the makeup will be spent attending
Avi Wigderson's third Sackler lecture.
Professor Avi Wigderson of the Institute for Advanced Study in
Princeton will deliver the Raymond and Beverly Sackler
Distinguished Lectures in Mathematics on March 9-13, 2009.
The titles and times are given below.
Lectures 2 and 3 cover material that is on the course syllabus, and
they happen to be scheduled during our regular course hours. We will
therefore attend lectures 2 and 3 together as a class, and the
material from those two lectures will be considered required.
None of the lectures requires special background, and each can be
understood independently of the others.
Lecture 1: The art of reduction (or, depth through breadth)
Monday, March 9, 12:15-13:15, Melamed Hall
Lecture 2: Deterministic randomness extractors
Wednesday, March 11, 12:15-13:15, Melamed Hall
Lecture 3: Seeded randomness extractors - applications and
constructions
Friday, March 13, 11:00-12:00, Schreiber 6
I've shared a document with you called
"Homework 1 grading".
It's not an attachment; it's stored online at Google Docs. To open the document, just click the link above.
I need five volunteers to sign up for grading homework 1.
Note that the newer version of homework 1 combined problems 5 and
6 into one problem.
Also, remember to turn in your solution to each problem on a separate
piece of paper. I strongly encourage you to write your solutions in English.
We also ask that you typeset your solutions.
If you are not familiar with the
Markov, Chebyshev, and Hoeffding/Chernoff bounds, please read about
them in one or more of the references given below under "useful pointers".
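For a quick sense of how the three bounds compare, here is a tiny numeric illustration (my own, with arbitrary numbers) for the upper tail of a sum of fair coin flips:

import math

m, t = 1000, 100                 # 1000 fair coin flips, deviation t above the mean
mean, var = m / 2, m / 4
print("Markov:   ", mean / (mean + t))          # P[X >= mean + t] <= E[X] / (mean + t)
print("Chebyshev:", var / t ** 2)               # P[|X - mean| >= t] <= Var[X] / t^2
print("Hoeffding:", math.exp(-2 * t ** 2 / m))  # P[X >= mean + t] <= exp(-2 t^2 / m)

The three printed bounds (about 0.83, 0.025, and 2e-9) show why the course leans so heavily on Chernoff/Hoeffding-type bounds.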