Weihao Kong
Postdoctoral researcher
University of Washington
Email: kweihao at gmail dot com

I am currently a postdoctoral researcher at the University of Washington, working with Sham M. Kakade. I recently received my PhD from Stanford University, where I was advised by Gregory Valiant; my thesis can be found here. I received my bachelor's degree from Shanghai Jiao Tong University, where I did undergraduate research with Wu-Jun Li.

CV · Teaching · Talks


  1. Online Model Selection for Reinforcement Learning with Function Approximation
    Jonathan N. Lee, Aldo Pacchiano, Vidya Muthukumar, Weihao Kong, Emma Brunskill.
    In submission.

    A meta-algorithm that adapts to the optimal model complexity with regret bound $\tilde{O}(d_*T^{2/3})$.

  2. Robust Meta-learning for Mixed Linear Regression with Small Batches [Video]
    Weihao Kong, Raghav Somani, Sham M. Kakade, Sewoong Oh.
    NeurIPS 2020

    Smooth trade-off in sample complexity between the number of tasks and the size of each task's dataset. Algorithmic contributions include an optimal robust PCA algorithm.

  3. Meta-Learning for Mixed Linear Regression [Video]
    Weihao Kong, Raghav Somani, Zhao Song, Sham M. Kakade, Sewoong Oh.
    ICML 2020

    First algorithm for $k$-mixture linear regression with polynomial time and sample complexity, utilizing a small set of tasks each with only $\tilde{O}(\sqrt{k})$ examples.

  4. Optimal Estimation of Change in a Population of Parameters
    Ramya Korlakai Vinayak, Weihao Kong, Sham M. Kakade.

    Estimates the distribution of treatment effects.

  5. Sublinear Optimal Policy Value Estimation in Contextual Bandits [Video]
    Weihao Kong, Gregory Valiant, Emma Brunskill.
    AISTATS 2020.

    Estimates the optimal policy value of a contextual bandit with $K$ disjoint $d$-dimensional arms in $\tilde{O}(\sqrt{d}K)$ rounds; minimax optimal up to log factors.

  6. Maximum Likelihood Estimation for Learning Populations of Parameters
    Ramya Korlakai Vinayak, Weihao Kong, Gregory Valiant, Sham M. Kakade.
    ICML 2019.

    Given $n$ different coins, each tossed $t$ times, estimates the distribution of coin biases with Wasserstein error $O(1/\sqrt{t\log n})$ when $n < 2^t$; minimax optimal.
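
For contrast with the rate above, the naive plug-in estimator (each coin's empirical frequency of heads) is limited to Wasserstein error on the order of $1/\sqrt{t}$ in the worst case. Below is a minimal simulation sketch of that plug-in baseline only (not the paper's MLE), assuming a hypothetical Beta(2,2) prior over the biases:

```python
import numpy as np

rng = np.random.default_rng(0)
n, t = 10_000, 20                      # n coins, t tosses each
p = rng.beta(2, 2, size=n)             # true (unknown) coin biases
heads = rng.binomial(t, p)             # observed head counts per coin

# Naive plug-in estimate: each coin's empirical bias heads/t.
p_hat = heads / t

# The 1-Wasserstein distance between two n-point empirical distributions
# equals the mean absolute difference of their sorted samples.
w1 = np.mean(np.abs(np.sort(p_hat) - np.sort(p)))
print(f"W1(plug-in, truth) = {w1:.4f}")
```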

  7. Efficient Algorithms and Lower Bounds for Robust Linear Regression
    Ilias Diakonikolas*, Weihao Kong*, Alistair Stewart*.
    SODA 2019

    Solves $\epsilon$-corrupted linear regression up to $\tilde{O}(\epsilon)$ error under a Gaussianity assumption; minimax optimal, whereas previous work achieved $O(\sqrt{\epsilon})$ error.
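
For illustration of the problem setup only (this is not the SODA 2019 algorithm, which relies on more careful filtering with provable guarantees), a simple iteratively trimmed least-squares baseline already handles gross corruptions on synthetic Gaussian data; all parameters here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, eps = 2000, 10, 0.1
beta = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ beta + 0.1 * rng.normal(size=n)

# An adversary corrupts an eps-fraction of the responses.
bad = rng.choice(n, size=int(eps * n), replace=False)
y[bad] += 50.0

def trimmed_ols(X, y, eps, iters=10):
    """Refit OLS after discarding the eps-fraction of largest residuals."""
    keep = np.arange(len(y))
    for _ in range(iters):
        b, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        resid = np.abs(y - X @ b)
        keep = np.argsort(resid)[: int((1 - eps) * len(y))]
    return b

b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)   # plain OLS, no trimming
b_trim = trimmed_ols(X, y, eps)
print(np.linalg.norm(b_ols - beta), np.linalg.norm(b_trim - beta))
```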

  8. Estimating Learnability in the Sublinear Data Regime [Code]
    Weihao Kong, Gregory Valiant.
    NIPS 2018

    Estimates the optimal prediction error of linear regression/classification with $O(\sqrt{d})$ examples; minimax optimal.
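
The flavor of the moment-based estimator can be sketched in a few lines for isotropic Gaussian covariates: for $i \neq j$, the cross terms $y_i y_j \langle x_i, x_j\rangle$ have expectation $\|\beta\|^2$, so the explained variance can be estimated without ever fitting $\beta$. The parameters below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)
d, n = 2000, 1000                        # fewer samples than dimensions
beta = rng.normal(size=d) / np.sqrt(d)   # ||beta||^2 ~ 1 (signal strength)
X = rng.normal(size=(n, d))              # isotropic Gaussian covariates
y = X @ beta + rng.normal(size=n)        # unit-variance label noise

# For i != j, E[y_i y_j <x_i, x_j>] = ||beta||^2 under identity covariance,
# so averaging the off-diagonal entries gives an unbiased estimate.
G = (X @ X.T) * np.outer(y, y)
est = (G.sum() - np.trace(G)) / (n * (n - 1))
print(f"estimated ||beta||^2: {est:.3f}, true: {beta @ beta:.3f}")
```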

  9. Approximating the Spectrum of a Graph [Code][Video]
    David Cohen-Steiner*, Weihao Kong*, Christian Sohler*, Gregory Valiant*.
    KDD 2018.

    Approximates the graph Laplacian spectrum in constant time.
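
A standard ingredient for this kind of result is estimating spectral moments $\frac{1}{n}\mathrm{tr}(L^k)$ without an eigendecomposition. The paper does this in constant time via random walks; the sketch below instead uses a Hutchinson trace estimator on a small cycle graph, purely for illustration:

```python
import numpy as np

# Cycle graph on n vertices; its normalized Laplacian is I - A/2.
n = 200
A = np.zeros((n, n))
idx = np.arange(n)
A[idx, (idx + 1) % n] = 1.0
A[idx, (idx - 1) % n] = 1.0
L = np.eye(n) - A / 2.0

rng = np.random.default_rng(3)

def spectral_moment(k, trials=50):
    """Estimate (1/n) tr(L^k) via Hutchinson's estimator:
    E[z^T L^k z] = tr(L^k) for random sign vectors z."""
    z = rng.choice([-1.0, 1.0], size=(n, trials))
    v = z
    for _ in range(k):
        v = L @ v
    return float(np.mean(np.sum(z * v, axis=0)) / n)

exact = [float(np.mean(np.linalg.eigvalsh(L) ** k)) for k in (1, 2, 3)]
approx = [spectral_moment(k) for k in (1, 2, 3)]
print(exact, approx)
```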

  10. Recovering Structured Probability Matrices
    Qingqing Huang*, Sham M. Kakade*, Weihao Kong*, Gregory Valiant*.
    ITCS 2018.
  11. Learning Populations of Parameters [Video]
    Kevin Tian, Weihao Kong, Gregory Valiant.
    NIPS 2017.

    Given $n$ different coins, each tossed $t$ times, estimates the distribution of coin biases with Wasserstein error $O(1/t)$ when $n \ge 2^t$; minimax optimal.

  12. Spectrum Estimation from Samples [Code]
    Weihao Kong, Gregory Valiant.
    Annals of Statistics, 2017.

    First algorithm that consistently estimates the eigenvalues of the covariance matrix with $o(d)$ samples.

  13. Optimal Allocation for Chunked-Reward Advertising
    Weihao Kong, Jian Li, Tao Qin, Tie-Yan Liu.
    WINE 2013.
  14. Isotropic Hashing [Code]
    Weihao Kong, Wu-Jun Li.
    NIPS 2012. [Poster]
  15. Manhattan Hashing for Large-Scale Image Retrieval [Code]
    Weihao Kong, Wu-Jun Li, Minyi Guo.
    SIGIR 2012.
  16. Double-Bit Quantization for Hashing [Code]
    Weihao Kong, Wu-Jun Li.
    AAAI 2012.
(asterisk indicates alphabetical authorship)