Joshua P. Gardner
I am currently pursuing a PhD in computer science at the University of Washington's Paul G. Allen School of Computer Science & Engineering, where I am fortunate to be advised by Ludwig Schmidt and Zoran Popović. I hold an M.S. in Applied Statistics, an M.S. in Information Science, and a B.A. with Highest Honors in Philosophy, all from the University of Michigan.
My research focuses on empirical machine learning: characterizing the conditions under which modern machine learning models succeed and fail, and using this understanding to select or design improved methods. This includes not only models' overall performance but also their robustness to distribution shift, fairness, and privacy. I have studied a diverse set of domains and applications under this general theme, including tabular and structured data; multimodal learning; music and audio (e.g., transcription and synthesis); and federated and collaborative learning.
Currently (Summer 2023) I am a Research Scientist Intern at Spotify Research. Previously, I was fortunate to spend just shy of two years as a Research Intern + Student Researcher on the Magenta team at Google Brain, working on machine learning problems in the music and audio domain, including MT3 (see additional publications here).
Awards and honors for my past work include a Best Paper Award at the International Conference on Learning Analytics and Knowledge (LAK), the Margaret Mann Award, the UMSI Professional Practice Fellowship, and the William K. Frankena Prize.
Selected Publications
For a full list of publications see my research page or Google Scholar profile.
- Benchmarking Distribution Shift in Tabular Data with TableShift.
Josh Gardner, Zoran Popović, Ludwig Schmidt.
To appear in: Neural Information Processing Systems (NeurIPS) 2023 (Datasets & Benchmarks Track).
[code] [web]
- Cross-Institutional Transfer Learning for Educational Models: Implications for Model Performance, Fairness, and Equity.
Josh Gardner, Renzhe Yu, Quan Nguyen, Christopher Brooks, Rene Kizilcec.
ACM Conference on Fairness, Accountability, and Transparency (ACM FAccT) 2023.
[pdf] [arxiv] [code]
- Subgroup Robustness Grows on Trees: An Empirical Baseline Study.
Josh Gardner, Zoran Popović, Ludwig Schmidt.
Neural Information Processing Systems (NeurIPS) 2022.
[arxiv] [code]
- OpenFlamingo: An Open-Source Framework for Training Vision-Language Models with In-Context Learning.
Anas Awadalla, Irena Gao, Josh Gardner, Jack Hessel, Yusuf Hanafy, Wanrong Zhu, Kalyani Marathe, Yonatan Bitton, Samir Gadre, Jenia Jitsev, Simon Kornblith, Pang Wei Koh, Gabriel Ilharco, Mitchell Wortsman, Ludwig Schmidt.
[arxiv] [blog] [code]
- MT3: Multi-Task Multitrack Music Transcription.
Josh Gardner, Ian Simon, Ethan Manilow, Curtis Hawthorne, Jesse Engel.
International Conference on Learning Representations (ICLR) 2022.
Spotlight Presentation (top 6.7% of submissions)
[arxiv] [web] [blog] [code]
- Evaluating the Fairness of Predictive Student Models Through Slicing Analysis.
Josh Gardner, Christopher Brooks, Ryan Baker.
International Conference on Learning Analytics and Knowledge (LAK) 2019.
Best Paper Award
[pdf]