Zihao Ye (叶子豪)

Ph.D. student @ SAMPL, UW CSE
Bill & Melinda Gates Center, Room 330
Email : zhye [at] cs [dot] washington [dot] edu
Google Scholar : Zihao Ye
Github : yzh119

About Me

I am Zihao Ye, a third-year Ph.D. student at the University of Washington’s Paul G. Allen School of Computer Science and Engineering, advised by Luis Ceze in the SAMPL research group. I also work closely with Tianqi Chen on the Apache TVM project.

Prior to joining UW, I spent two years at AWS, where I worked with Minjie Wang and Zheng Zhang. I obtained my bachelor’s degree from the ACM Honors Class at Shanghai Jiao Tong University.

We organize talks at SAMPL on topics including Systems, Architecture, Compilers, Verification, and Machine Learning.

Besides research, I enjoy diving into software/hardware details and contributing to open-source projects.


I have broad interests in Computer Systems, Compilers, Programming Languages, and Computer Architecture. My current research centers on sparse computation.

Feel free to drop me an email if our interests align; I’m open to collaborations.


Current Projects

Compiler for Sparsity in Deep Learning

SparseTIR is a tensor-level abstraction for representing and optimizing sparse/irregular operators in Deep Learning. The project is a close collaboration with Ruihang and Tianqi from CMU Catalyst and Junru from OctoML; we thank our advisors TQ and Luis for their support and advice, and the TensorIR team for their assistance.

Selected Publications

ASPLOS 2023. SparseTIR: Composable Abstractions for Sparse Compilation in Deep Learning.
Zihao Ye, Ruihang Lai, Junru Shao, Tianqi Chen, and Luis Ceze.
The 28th ACM International Conference on Architectural Support for Programming Languages and Operating Systems, 2023. Distinguished Artifact Award.

Activities and Service