James R. Lee

Associate Professor
Computer Science
University of Washington

Paul G. Allen Center, Room 640
jrl [at] cs [dot] washington [dot] edu


Autumn 2015: Foundations of Computing I
Spring 2015: Foundations of Computing I
Winter 2015: Randomized algorithms

Associate Editor, SIDMA.
Associate Editor, SICOMP.
I was on the program committees for
SODA 2014, ICALP 2014, and FOCS 2014.

Research interests:
Algorithms, complexity, the theory of computation. Geometry and analysis at the interface between the continuous and discrete. Probability and stochastic processes.

Students: Jeffrey Hon, Ben Eggers, Yueqi Sheng, Austin Stromme

Postdocs: Ronen Eldan (now at Weizmann)

Curriculum Vitae

Talks and events

Selected recent works:

  • Chang's Lemma is a widely employed result in additive combinatorics. It gives optimal bounds on the dimension of the large spectrum of probability distributions on finite abelian groups. In this note, we show how Chang's Lemma and a powerful variant due to Bloom both follow easily from an approximation theorem for probability measures in terms of generalized Riesz products. The latter result involves no algebraic structure. The proofs are correspondingly elementary.
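
    To make the "large spectrum" concrete, here is a small numerical sketch (an illustration only, assuming numpy; the group Z_64, the measure, and the threshold δ = 0.5 are arbitrary choices):

        import numpy as np

        # Illustrative only: a probability distribution on the cyclic group
        # Z_64 (the paper treats general finite abelian groups).
        N = 64
        mu = np.zeros(N)
        mu[:8] = 1.0 / 8                     # uniform measure on {0, ..., 7}

        # Fourier coefficients mu_hat(k) = sum_x mu(x) exp(-2*pi*i*k*x/N).
        mu_hat = np.fft.fft(mu)

        # The delta-large spectrum: characters k with |mu_hat(k)| >= delta.
        delta = 0.5
        print(np.nonzero(np.abs(mu_hat) >= delta)[0])
        # -> frequencies clustered near 0 (mod N), a low-dimensional set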

  • [Image credit: Bernd Sturmfels]

    We introduce a method for proving lower bounds on the efficacy of semidefinite programming (SDP) relaxations for combinatorial problems. In particular, we show that the cut, TSP, and stable set polytopes on n-vertex graphs are not the linear image of the feasible region of any SDP (i.e., any spectrahedron) of dimension less than 2^{n^c}, for some constant c > 0. This result yields the first super-polynomial lower bounds on the semidefinite extension complexity of any explicit family of polytopes.

    Our results follow from a general technique for proving lower bounds on the positive semidefinite rank of a matrix. To this end, we establish a close connection between arbitrary SDPs and those arising from the sum-of-squares SDP hierarchy. For approximating maximum constraint satisfaction problems, we prove that SDPs of polynomial size are equivalent in power to those arising from degree-O(1) sum-of-squares relaxations. This result implies, for instance, that no family of polynomial-size SDP relaxations can achieve better than a 7/8-approximation for MAX-3-SAT.

    [ PPT slides ]
    Best paper award, STOC 2015.
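
    For readers less familiar with such relaxations, here is a minimal sketch of the basic degree-2 SDP relaxation of Max Cut, the entry point of the sum-of-squares hierarchy discussed above (assuming the cvxpy package; the 5-cycle instance is an arbitrary illustrative choice):

        import cvxpy as cp

        # Illustrative only: the basic degree-2 SDP relaxation of Max Cut on
        # the 5-cycle.  X is the Gram matrix of unit vectors relaxing the
        # +/-1 vertex labels of an integral cut.
        n = 5
        edges = [(i, (i + 1) % n) for i in range(n)]

        X = cp.Variable((n, n), PSD=True)
        objective = cp.Maximize(sum((1 - X[i, j]) / 2 for i, j in edges))
        problem = cp.Problem(objective, [cp.diag(X) == 1])
        print(problem.solve())   # ~4.52, while the true max cut of C_5 is 4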

  • Consider a discrete-time martingale {X_t} taking values in a Hilbert space. We show that if E[|X_{t+1} - X_t|^2] = 1 and |X_{t+1} - X_t| ≤ L hold for all times t ≥ 0, then {X_t} satisfies the small-ball estimate P[|X_t| < R] ≤ O(R/t^{1/2}). Following [Lee-Peres 2013], this has applications to diffusive estimates for random walks on vertex-transitive graphs.
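
    A simulation sketch of the bound's shape (assuming numpy; all parameters arbitrary): a ±1 random walk satisfies the hypotheses with L = 1, and its small-ball probability indeed scales like R/t^{1/2}.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative only: X_t is a sum of t independent +/-1 steps, so the
        # increments satisfy E[|X_{t+1} - X_t|^2] = 1 and |X_{t+1} - X_t| <= 1.
        t, R, trials = 10_000, 5.0, 100_000
        X_t = 2.0 * rng.binomial(t, 0.5, size=trials) - t

        print(np.mean(np.abs(X_t) < R))   # empirical small-ball probability
        print(R / np.sqrt(t))             # the two agree up to a constant factor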

  • We show that under the Ornstein-Uhlenbeck semigroup (i.e., the natural diffusion process) on n-dimensional Gaussian space, every nonnegative, measurable function exhibits a uniform tail bound better than the one implied by Markov's inequality and conservation of mass. This positively resolves the Gaussian limiting case of Talagrand's convolution conjecture (1989).

    Video: Talagrand's convolution conjecture and geometry via coupling (IAS)
    [ PPT slides ]
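
    The semigroup itself is easy to simulate. The following Monte Carlo sketch (assuming numpy and scipy; the time, test function, and one-dimensional setting are arbitrary choices) applies the Mehler formula (P_t f)(x) = E[f(e^{-t} x + (1 - e^{-2t})^{1/2} Z)] to a nonnegative function of unit Gaussian mass:

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)

        # Illustrative only: the one-dimensional Ornstein-Uhlenbeck semigroup
        # applied via the Mehler formula.
        def ou_semigroup(f, t, x, samples=200_000):
            z = rng.standard_normal(samples)
            return f(np.exp(-t) * x + np.sqrt(1 - np.exp(-2 * t)) * z).mean()

        # A nonnegative f with unit Gaussian mass: a normalized indicator.
        a = 1.0
        mass = norm.cdf(a) - norm.cdf(-a)
        f = lambda y: (np.abs(y) <= a) / mass

        for x in [0.0, 1.0, 2.0]:
            print(x, ou_semigroup(f, t=0.5, x=x))
        # P_t f is already much flatter than f; this smoothing is the
        # mechanism behind the improved tail bound.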

  • Recall the classical hypothesis testing setting with two convex sets of probability distributions P and Q. One receives either n i.i.d. samples from a distribution p in P or from a distribution q in Q and wants to decide from which set the samples were drawn. It is known that the optimal exponential rate at which errors decrease can be achieved by a simple maximum-likelihood test that does not depend on p or q, but only on the sets P and Q.

    We consider an adaptive generalization of this model where the choice of p in P and q in Q can change in each sample in some way that depends arbitrarily on the previous samples. We prove that even in this case, the optimal exponential error rate can be achieved by a simple maximum-likelihood test that depends only on P and Q.

    We then show that the adversarial model has applications in hypothesis testing for quantum states using restricted measurements. The basic idea is that in our setup, the deleterious effects of entanglement can be simulated by an adaptive classical adversary. We prove a quantum Stein's Lemma in this setting. Our arguments yield an alternate proof of Li and Winter's recent strengthening of strong subadditivity for quantum relative entropy.
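
    A toy, non-adaptive instance of the setup (a sketch assuming numpy; the Bernoulli pair is an arbitrary choice): with P = {Bernoulli(0.3)} and Q = {Bernoulli(0.7)}, the maximum-likelihood test's error decays at the optimal exponential rate, up to a polynomial factor.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative only: P = {Bernoulli(0.3)}, Q = {Bernoulli(0.7)}.  On
        # samples from p, the maximum-likelihood test errs exactly when more
        # than half of the n coordinates are ones.
        p, q, trials = 0.3, 0.7, 200_000
        chernoff = -np.log(np.sqrt(p * q) + np.sqrt((1 - p) * (1 - q)))

        for n in [10, 20, 30, 40]:
            errors = np.mean(rng.binomial(n, p, size=trials) > n / 2)
            print(n, errors, np.exp(-n * chernoff))
        # -> the error decays at the exponential rate exp(-n * chernoff),
        #    up to a polynomial factor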

  • [Updated 18-Feb-2015]
    We prove super-polynomial lower bounds on the size of linear programming relaxations for approximation versions of constraint satisfaction problems. We show that for these problems, polynomial-sized linear programs are exactly as powerful as programs arising from a constant number of rounds of the Sherali-Adams hierarchy. In particular, any polynomial-sized linear program for Max Cut has an integrality gap of 1/2 and any such linear program for Max 3-Sat has an integrality gap of 7/8.

    Video: Linear programming and constraint satisfaction (Simons Institute)
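
    As a toy illustration of an LP relaxation and its integrality gap (a sketch assuming cvxpy; this is the basic metric LP on K_5, not the Sherali-Adams hierarchy itself):

        import itertools
        import cvxpy as cp

        # Illustrative only: the basic "metric" LP relaxation of Max Cut on
        # the complete graph K_5.  x[i, j] relaxes the indicator that edge
        # {i, j} crosses the cut.
        n = 5
        x = {(i, j): cp.Variable(nonneg=True)
             for i, j in itertools.combinations(range(n), 2)}

        constraints = [v <= 1 for v in x.values()]
        for i, j, k in itertools.combinations(range(n), 3):
            # a cut meets each triangle in an even number of edges
            constraints += [x[i, j] <= x[i, k] + x[j, k],
                            x[i, k] <= x[i, j] + x[j, k],
                            x[j, k] <= x[i, j] + x[i, k],
                            x[i, j] + x[i, k] + x[j, k] <= 2]

        lp = cp.Problem(cp.Maximize(sum(x.values())), constraints)
        print(lp.solve())   # 20/3 ~ 6.67, while the true max cut of K_5 is 6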

  • The classical Okamura-Seymour theorem states that for an edge-capacitated, multi-commodity flow instance in which all terminals lie on a single face of a planar graph, there exists a feasible concurrent flow if and only if the cut conditions are satisfied. Simple examples show that a similar theorem is impossible in the node-capacitated setting. Nevertheless, we prove that an approximate flow/cut theorem does hold: for some universal c > 0, if the node cut conditions are satisfied, then one can simultaneously route a c-fraction of all the demands. This answers an open question of Chekuri and Kawarabayashi. More generally, we show that this holds in the setting of multi-commodity polymatroid networks introduced by Chekuri et al. Our approach employs a new type of random metric embedding in order to round the convex programs corresponding to these more general flow problems.

    Notes: Planar multi-flows and L1 embeddings
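
    Here is a minimal sketch of a concurrent-flow LP in the Okamura-Seymour (edge-capacitated) setting (assuming cvxpy; the 4-cycle instance with crossing unit demands is an arbitrary illustrative choice):

        import cvxpy as cp

        # Illustrative only: maximum concurrent flow on a 4-cycle with unit
        # edge capacities and both demands, (0,2) and (1,3), on the outer
        # face.  Each demand may split between its two routes around the cycle.
        paths = {
            (0, 2): [[(0, 1), (1, 2)], [(0, 3), (2, 3)]],
            (1, 3): [[(1, 2), (2, 3)], [(0, 1), (0, 3)]],
        }
        edges = [(0, 1), (1, 2), (2, 3), (0, 3)]
        f = {(d, i): cp.Variable(nonneg=True) for d in paths for i in range(2)}
        lam = cp.Variable()

        constraints = [f[d, 0] + f[d, 1] >= lam for d in paths]
        for e in edges:   # unit capacity on every edge
            load = sum(f[d, i] for d in paths for i in range(2)
                       if e in paths[d][i])
            constraints.append(load <= 1)

        print(cp.Problem(cp.Maximize(lam), constraints).solve())
        # -> 1.0: the cut conditions hold, and the full demand is routable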

  • A basic fact in spectral graph theory is that the number of connected components in an undirected graph is equal to the multiplicity of the eigenvalue zero in the Laplacian matrix of the graph. In particular, the graph is disconnected if and only if there are at least two eigenvalues equal to zero. Cheeger’s inequality and its variants provide a robust version of the latter fact; they state that a graph has a sparse cut if and only if there are at least two eigenvalues that are close to zero.

    It has been conjectured that an analogous characterization holds for higher multiplicities: There are k eigenvalues close to zero if and only if the vertex set can be partitioned into k subsets, each defining a sparse cut. We resolve this conjecture positively. Our result provides a theoretical justification for clustering algorithms that use the bottom k eigenvectors to embed the vertices into R^k, and then apply geometric considerations to the embedding. We also show that these techniques yield a nearly optimal quantitative connection between the expansion of sets of measure ≈ 1/k and the kth smallest eigenvalue of the normalized Laplacian.

    Notes: A no frills proof of the higher-order Cheeger inequality
    Related: One hundred hours of lectures from the SGT program at the Simons Institute.
    Related: Laurent Miclo uses the higher-order Cheeger inequality as the basis of his resolution of the conjecture of Høegh-Krohn and Simon that every hyperbounded operator has a spectral gap.
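
    A numerical sketch of the phenomenon (assuming numpy; the planted 3-cluster graph and its parameters are arbitrary choices):

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative only: a random graph with k = 3 planted clusters.
        k, m = 3, 30
        n = k * m
        A = (rng.random((n, n)) < 0.02).astype(float)    # sparse background
        for c in range(k):                               # dense diagonal blocks
            block = slice(c * m, (c + 1) * m)
            A[block, block] = rng.random((m, m)) < 0.5
        A = np.triu(A, 1)
        A = A + A.T                                      # simple undirected graph

        d = A.sum(axis=1)
        L = np.eye(n) - A / np.sqrt(np.outer(d, d))      # normalized Laplacian
        print(np.round(np.linalg.eigvalsh(L)[:5], 3))
        # -> three eigenvalues near 0, then a spectral gap, matching the
        #    three sparse cuts around the planted clusters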

  • We show that if a metric space X threshold-embeds into a Hilbert space, then X has Markov type 2. As a consequence, planar graph metrics and doubling metrics have Markov type 2, answering questions of Naor, Peres, Schramm, and Sheffield. More generally, if a metric space X threshold-embeds into a p-uniformly smooth Banach space, then X has Markov type p. This suggests some non-linear analogs of Kwapien's theorem. For instance, a subset of L^1 threshold-embeds into Hilbert space if and only if it has Markov type 2.
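
    A concrete sketch of the Markov type 2 inequality (assuming numpy; parameters arbitrary): the cycle with its graph distance is a planar graph metric, and under the stationary simple random walk the ratio E d(Z_t, Z_0)^2 / t stays bounded.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative only: the stationary simple random walk on the cycle
        # C_101 with its graph distance d.  Markov type 2 asserts
        #   E d(Z_t, Z_0)^2 <= K^2 * t * E d(Z_1, Z_0)^2
        # with K independent of the chain and the time t.
        n, T, trials = 101, 50, 100_000
        pos = rng.choice([-1, 1], size=(trials, T)).cumsum(axis=1)
        dist2 = lambda s: np.minimum(s % n, (-s) % n) ** 2

        for t in [1, 10, 25, 50]:
            print(t, np.mean(dist2(pos[:, t - 1])) / t)
        # -> the ratio stays bounded (here ~1), as Markov type 2 requires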

  • We show that the cover time of a graph can be related to the square of the maximum of the associated Gaussian free field. This yields a positive answer to a question of Aldous and Fill (1994) on deterministic approximations to the cover time, and positively resolves the Blanket Time conjecture of Winkler and Zuckerman (1996).

    Video: Cover times of graphs and the Gaussian free field (Newton Institute)
    Notes: Majorizing measures and Gaussian processes
    Related questions and conjectures (all solved except the one after Lemma 4)

    See also the related preprint of Alex Zhai that resolves our main conjecture.
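
    A simulation sketch of the connection (assuming numpy; the cycle C_50 and the sample sizes are arbitrary choices):

        import numpy as np

        rng = np.random.default_rng(1)

        # Illustrative only: the cycle C_50.  The GFF pinned at vertex 0 has
        # covariance given by the inverse of the graph Laplacian with the
        # row and column of vertex 0 removed.
        n = 50
        L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
        L[0, -1] = L[-1, 0] = -1                       # Laplacian of C_n
        cov = np.linalg.inv(L[1:, 1:])                 # Green's function

        gff = rng.multivariate_normal(np.zeros(n - 1), cov, size=2000)
        e_max = np.maximum(gff.max(axis=1), 0).mean()  # include eta(0) = 0

        def cover_time():                              # one random-walk run
            seen, v, t = {0}, 0, 0
            while len(seen) < n:
                v = (v + rng.choice([-1, 1])) % n
                t += 1
                seen.add(v)
            return t

        print(np.mean([cover_time() for _ in range(200)]))  # ~ n(n-1)/2 = 1225
        print(n * e_max ** 2)   # |E| * (E max GFF)^2: the same order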

Some older selected papers:

My research has been generously supported by the National Science Foundation, the Sloan Foundation, Microsoft, and the Simons Institute.