Zihao Ye (叶子豪)

Ph.D. student @ SAMPL, UW CSE
Bill & Melinda Gates Center, Room 330
Email : zhye [at] cs [dot] washington [dot] edu
Google Scholar : Zihao Ye
Github : @yzh119

About Me

I am Zihao Ye, a second-year Ph.D. student at the University of Washington’s Paul G. Allen School of Computer Science and Engineering, advised by Luis Ceze in the SAMPL research group. I also work closely with Tianqi Chen on the Apache TVM project.

Prior to joining UW, I spent two years at AWS, where I worked with Minjie Wang and Zheng Zhang. I obtained my bachelor’s degree from the ACM Honors Class at Shanghai Jiao Tong University.

We organize talks at SAMPL on topics including Systems, Architecture, Compilers, Verification, and Machine Learning.


I have broad interests in Computer Systems, Compilers, Programming Languages, and Computer Architecture. My current research centers on sparse computation.

Feel free to drop me an email if our interests align; I’m open to collaborations.

Current Projects


SparseTIR is a unified abstraction for representing and optimizing sparse/irregular deep learning workloads, built on top of TVM’s TensorIR. It aims to generate efficient code for a variety of sparse formats on heterogeneous hardware.

Invited Talks

Composable Abstractions for Sparse Compilation in Deep Learning

Recent Publications

SparseTIR: Composable Abstractions for Sparse Compilation in Deep Learning.
Zihao Ye, Ruihang Lai, Junru Shao, Tianqi Chen, and Luis Ceze.
arXiv preprint arXiv:2207.04606, 2022.

Activity and Service