CSE 525: Randomized algorithms and probabilistic analysis

Winter, 2015

Instructor: James R. Lee

Office hours: Thursday, 12-1:30pm in CSE 640 or by appointment

TA: Makrand Sinha (makrand@cs)

Office hours: Monday, 12-1pm in CSE 306 or by appointment

Class email list: cse525a_wi15@uw


Note that the "bonus material" is not required reading. It's there if you're interested in learning more about the topic of the lecture and its broader applications.

class topic notes bonus material
6-Jan some basics pdf Schwartz-Zippel Lemma
8-Jan the probabilistic method, linearity of expectation pdf Incidence geometry and sum-product estimates; Flows and eigenvalues
13-Jan the second moment method pdf See Chapter 5 of Lyons-Peres
15-Jan a second helping of second moments pdf Cover time of the discrete torus
20-Jan Chernoff bounds and randomized rounding pdf Concentration inequalities: A nonasymptotic theory of independence
22-Jan the method of Laplace transforms pdf
27-Jan martingales and Azuma's inequality pdf
29-Jan ''   '' pdf
3-Feb Bourgain's embedding Matousek's lecture notes on metric embeddings
5-Feb memoryless random variables and low-diameter partitions pdf
10-Feb the curse of dimensionality and dimension reduction pdf
12-Feb compressive sensing and the RIP pdf The Effectiveness of Convex Programming in the Information and Physical Sciences (video)
19-Feb concentration for sums of independent random matrices pdf Proof of Golden-Thompson; Trace inequalities and quantum entropy
24-Feb ''   '' pdf
26-Feb spectral sparsification via sampling pdf Twice Ramanujan sparsifiers
3-Mar random walks: hitting times, commute times, and cover times pdf Random walks and electric networks (Doyle & Snell)
5-Mar ''     '' pdf
10-Mar Markov chains and mixing times pdf
12-Mar eigenvalues, expansion, and rapid mixing pdf Approximating the permanent

Course overview

Metaphysical. Randomness is a powerful and ubiquitous tool in algorithm design and data analysis. This is especially true in a world overrun by data. No efficient algorithm can possibly take a high-fidelity view of all of it. But often we don't have to; uncertainty plays the dual roles of blessing and curse. And in our darkest high-dimensional hour, "concentration of measure" allows us to extract information without first knowing where to look.

Sometimes the signal is everywhere. Sometimes we only need a glimpse of the truth to head in the right direction. Sometimes a random direction suffices.

Evaluation and grading

  • weekly/bi-weekly homeworks (60%)
  • two take-home exams (40%)

Syllabus (approximate)

  • Discrete probability
    • The probabilistic method
    • Linearity of expectation
    • Variance and the 2nd moment method
    • Lovász Local Lemma
    • Martingales
    • Chernoff and Hoeffding-Azuma inequalities
  • High-dimensional geometry and statistics
    • Concentration of measure
    • Dimension reduction
    • Locality sensitive hashing
    • Nearest-neighbor search
    • Restricted isometry and compressive sensing
  • Information and entropy
    • Measures of information
    • Max-entropy approximation
    • Sparsification via sampling
    • Multiplicative weights update and boosting
    • Online stochastic gradient descent
  • Markov chains and convergence to equilibrium
    • Random walks
    • Eigenvalues and rapid mixing
    • Expander graphs
    • Markov chain Monte Carlo
    • Approximate counting

Where & when:
Tues & Thurs, 10:30-11:50
EEB 025 on 15-Jan
EEB 037 afterward

[Late HW policy: I will allow one (slightly) late homework per person (e.g., to accommodate a paper deadline). This must be arranged beforehand.]

[ Dropbox for assignments ]

  • HW #1, due 22-Jan
  • HW #2, out 20-Jan, due 3-Feb
  • HW #3, out 3-Feb, due 12-Feb
  • HW #4, out 19-Feb, due 5-Mar
  • HW #5, out 5-Mar, due 12-Mar


  • Midterm exam
    Out: Thursday, 12-Feb
    Due: Thursday, 19-Feb, 11:59pm
    No class on Tuesday, 17-Feb.
  • Final exam
    Out: Thursday, 12-Mar
    Due: Thursday, 19-Mar