Sham M. Kakade

Washington Research Foundation Data Science Chair

Associate Professor in both:
Department of Computer Science
Department of Statistics

Senior Data Science Fellow,
eScience Institute

Adjunct Professor in:
Department of Electrical Engineering

University of Washington


Algorithmic Foundations of Data Science Institute
UW received an NSF TRIPODS grant! We have started an institute on the theoretical foundations of data science, and we have also received additional support for our data science efforts.



I study the theoretical foundations of machine learning and AI across a broad range of paradigms, and I enjoy collaborating with a diverse set of researchers to tackle these questions. Understanding these foundations is central to designing algorithms that are efficient from both computational and statistical perspectives, and I seek to use these advances to further the state of the art in various AI domains.


Sham Kakade is a Washington Research Foundation Data Science Chair, with a joint appointment in the Department of Computer Science and the Department of Statistics at the University of Washington. He works on the theoretical foundations of machine learning, focusing on the design of provable, practical, and statistically and computationally efficient algorithms. His contributions, with a diverse set of collaborators, include: establishing principled approaches in reinforcement learning (including the natural policy gradient, conservative policy iteration, and the PAC-MDP framework); optimal algorithms for the stochastic and non-stochastic multi-armed bandit problems (including the widely used linear bandit and Gaussian process bandit models); computationally and statistically efficient tensor decomposition methods for estimating latent variable models (including mixtures of Gaussians, latent Dirichlet allocation, hidden Markov models, and overlapping communities in social networks); and faster algorithms for large-scale convex and nonconvex optimization (including how to escape from saddle points efficiently). He is the recipient of the IBM Goldberg Best Paper Award (2007) for contributions to fast nearest-neighbor search, and of the INFORMS Revenue Management and Pricing Section Prize for best paper (2014). He was program chair for COLT 2011.
Sham completed his Ph.D. at the Gatsby Computational Neuroscience Unit at University College London under the supervision of Peter Dayan, and he was a postdoc in the Department of Computer Science at the University of Pennsylvania under the supervision of Michael Kearns. He was an undergraduate at Caltech, where he studied physics under the supervision of John Preskill. Sham has been a Principal Research Scientist at Microsoft Research, New England; an associate professor in the Department of Statistics at the Wharton School, University of Pennsylvania; and an assistant professor at the Toyota Technological Institute at Chicago.



Discovering Latent Structure in Societal-Scale Data  

Activities and Services

Committee for the Sloan Research Fellowships in Computer Science (active).
Co-organizer for the Simons Institute program on the Foundations of Machine Learning, Winter 2017.
Co-chair for the Simons Institute Representation Learning workshop, March 2017.
Co-chair for the IMS-MSR Workshop: Foundations of Data Science, June 11th, 2015.
Steering committee for the fourth New England Machine Learning Day, May 18th, 2015.
Program committee for the third New England Machine Learning Day, May 13th, 2014.
Co-chair for New York Computer Science and Economics Day V, Dec 3rd, 2012.
Program committee for the first New England Machine Learning Day, May 16th, 2012.
Program chair for the 24th Annual Conference on Learning Theory (COLT 2011), which took place in Budapest, Hungary, July 9-11, 2011.


Tutorials

Tensor Decompositions for Learning Latent Variable Models, AAAI 2014
Tensor Decomposition Methods for Learning Latent Variable Models, ICML 2013

Course Links

CSE 547 / STAT 548: Machine Learning for Big Data, Spring 2018
CSE 446: Machine Learning, Winter 2018
CSE 547 / STAT 548: Machine Learning for Big Data, Spring 2017
CSE 546: Machine Learning, Autumn 2016
CSE 547 / STAT 548: Machine Learning for Big Data, Spring 2016
CSE 546: Machine Learning, Autumn 2015
Stat 928: Statistical Learning Theory
Stat 991: Multivariate Analysis, Dimensionality Reduction, and Spectral Methods
Large Scale Learning
Learning Theory

Current Postdocs

Ramya Korlakai Vinayak

Current Students

Gabriel Cadamuro (co-advised with Josh Blumenstock)
Rahul Kidambi
Krishna Pillutla (co-advised with Zaid Harchaoui)
Aravind Rajeswaran (co-advised with Emo Todorov)
John Thickstun (co-advised with Zaid Harchaoui)

Former Postdocs

Praneeth Netrapalli
Rong Ge
Daniel Hsu

Former Interns (in reverse chronological order)

Chi Jin
Aaron Sidford
Roy Frostig
David Belanger
Chen Wang
Qingqing Huang
Jaehyun Park
Karl Stratos
Do-kyum Kim
Praneeth Netrapalli
Rashish Tandon
Rong Ge
Adel Javanmard
Matus Telgarsky
Daniel Hsu (while at TTI-C)
Sathyanarayan Anand (while at TTI-C)

Contact Info

Email: sham [at] cs [dot] washington [dot] edu

CSE office address:
Computer Science & Engineering, Office 436
Paul G. Allen Center

Stat office address:
Department of Statistics, Office 303
Padelford Hall

University of Washington
Seattle, WA 98195