Rahul Nadkarni

CSE Ph.D. student, University of Washington

M.S. CSE, University of Washington, 2017
B.S. EECS & Bioengineering, UC Berkeley, 2015

rahuln [at] cs.uw.edu
Curriculum Vitae
Google Scholar
LinkedIn
GitHub

I am a Ph.D. student in the Paul G. Allen School of Computer Science & Engineering at the University of Washington, advised by Noah Smith, and a member of the ARK research group. In the past, I worked with Emily Fox on statistical machine learning methods for time series applied to neuroimaging data. I was fortunate to be funded by an IGERT fellowship in Big Data and Data Science from 2017 to 2019. As an undergraduate, I did research in the Brain-Machine Interface Systems Lab at UC Berkeley, advised by Jose Carmena.

My research interests are broadly in natural language processing and machine learning. I am currently working on parameter-efficient methods for domain adaptation of large pretrained language models; previously, I worked on applying such models to scientific knowledge graph completion and knowledge discovery.

Publications

Conference papers

Binding Language Models in Symbolic Languages
Zhoujun Cheng*, Tianbao Xie*, Peng Shi, Chengzu Li, Rahul Nadkarni, Yushi Hu, Caiming Xiong, Dragomir Radev, Mari Ostendorf, Luke Zettlemoyer, Noah A. Smith, Tao Yu
International Conference on Learning Representations (ICLR), 2023
paper arXiv
Scientific Language Models for Biomedical Knowledge Base Completion: An Empirical Study
Rahul Nadkarni, David Wadden, Iz Beltagy, Noah A. Smith, Hannaneh Hajishirzi, and Tom Hope
Automated Knowledge Base Construction (AKBC), 2021
paper arXiv video

Meeting abstracts

Dynamic functional connectivity in auditory attention task
Jordan Drew, Eric Larson, Nicholas Foti, Rahul Nadkarni, Emily Fox, Adrian KC Lee
The Journal of the Acoustical Society of America, 2021
abstract

Workshop papers

A hierarchical state-space model with Gaussian process dynamics for functional connectivity estimation
Rahul Nadkarni, Nicholas J. Foti, Adrian KC Lee, and Emily B. Fox
NeurIPS Workshop on Learning Meaningful Representations of Life, 2019
abstract poster
Learning dynamic functional connectivity networks from infant magnetoencephalography data
Rahul Nadkarni, Nicholas J. Foti, and Emily B. Fox
NeurIPS BigNeuro Workshop, 2017
abstract poster
Sparse plus low-rank graphical models of time series for functional connectivity in MEG
Nicholas J. Foti, Rahul Nadkarni, Adrian KC Lee, and Emily B. Fox
SIGKDD Workshop on Mining and Learning from Time Series, 2016
paper slides talk

Preprints & tech reports

Robust recovery of time-varying functional connectivity in MEG
Rahul Nadkarni, Nicholas J. Foti, and Emily B. Fox
paper
Sparse plus low-rank graphical models of time series to infer functional connectivity from MEG recordings
Rahul Nadkarni, Nicholas J. Foti, Adrian KC Lee, and Emily B. Fox
paper

Professional Experience

Facebook
Software Engineer Intern, Machine Learning (Ph.D.)
June – September 2021
Google
Software Engineering Intern, Ph.D.
June – September 2017

Graduate Coursework

Autumn 2015 Statistical Inference (STAT 512)
Winter 2016 Graphical Models (CSE 515), Natural Language Processing (CSE 517)
Spring 2016 Machine Learning for Big Data (CSE 547)
Autumn 2016 Convex Optimization (EE 578), Databases (CSE 544)
Winter 2017 Computational Neuroscience (CSE 528), Algorithms (CSE 521)
Spring 2017 Computer Vision (CSE 576)
Winter 2018 Online and Adaptive Methods for Machine Learning (CSE 599I)

Teaching Experience

Autumn 2015 Teaching Assistant, Data Structures and Algorithms (CSE 373)
Winter 2016 Teaching Assistant, Data Structures and Algorithms (CSE 373)
Spring 2016 Teaching Assistant, Introduction to Artificial Intelligence (CSE 415)
Autumn 2020 Teaching Assistant, Introduction to Artificial Intelligence (CSE 415)

Service

Reviewer: NeurIPS 2019, 2020, 2021; ICLR 2021; JAIR 2021, 2023