Mandar Joshi | मंदार जोशी

(pronounced Mun-daar Joe-shi)

I am a final-year PhD student at the Paul G. Allen School of Computer Science and Engineering at the University of Washington, Seattle. I work on natural language processing, advised by Luke Zettlemoyer and Dan Weld. Much of my recent focus has been on building large-scale self-supervised models and applying them to downstream tasks such as question answering and summarization.

Before coming to the UW, I was a research engineer at IBM Research, Bangalore, where I worked on language technologies for IBM's businesses. I received my Master's degree from the Indian Institute of Technology Bombay, where Soumen Chakrabarti was my thesis adviser.


When I am not doing Computer Science-y stuff, I like to hike, dance, and play racket sports.


Publications

Armen Aghajanyan, Dmytro Okhonko, Mike Lewis, Mandar Joshi, Hu Xu, Gargi Ghosh, Luke Zettlemoyer. HTLM: Hyper-Text Pre-Training and Prompting of Language Models. arXiv:2107.06955, 2021.
Arie Cattan, Alon Eirew, Gabriel Stanovsky, Mandar Joshi, Ido Dagan. Cross-document Coreference Resolution over Predicted Mentions. Findings of ACL, 2021 (Short).
Arie Cattan, Alon Eirew, Gabriel Stanovsky, Mandar Joshi, Ido Dagan. Streamlining Cross-Document Coreference Resolution: Evaluation and Modeling. arXiv:2009.11032, 2020.
Terra Blevins, Mandar Joshi, Luke Zettlemoyer. FEWS: Large-Scale, Low-Shot Word Sense Disambiguation with the Dictionary. EACL, 2021.
Bhargavi Paranjape, Mandar Joshi, John Thickstun, Hannaneh Hajishirzi, Luke Zettlemoyer. An Information Bottleneck Approach to Controlling Conciseness in Rationale Extraction. EMNLP, 2020.
[ code ]
Mandar Joshi, Kenton Lee, Yi Luan, Kristina Toutanova. Contextualized Representations Using Textual Encyclopedic Knowledge. arXiv:2004.12006, 2020.
Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, Mike Lewis, Luke Zettlemoyer, Veselin Stoyanov. RoBERTa: A Robustly Optimized BERT Pretraining Approach. arXiv:1907.11692, 2019.
[ code ]
Mandar Joshi*, Danqi Chen*, Yinhan Liu, Daniel S. Weld, Luke Zettlemoyer, Omer Levy. SpanBERT: Improving Pre-training by Representing and Predicting Spans. TACL, 2019.
* Equal Contribution
[ code ]
Mandar Joshi, Omer Levy, Daniel S. Weld, Luke Zettlemoyer. BERT for Coreference Resolution: Baselines and Analysis. EMNLP, 2019 (Short).
[ code ]
Mandar Joshi, Eunsol Choi, Omer Levy, Daniel S. Weld, Luke Zettlemoyer. pair2vec: Compositional Word-Pair Embeddings for Cross-Sentence Inference. NAACL, 2019.
[ code ]
Mandar Joshi, Eunsol Choi, Daniel Weld, Luke Zettlemoyer. TriviaQA: A Large Scale Distantly Supervised Challenge Dataset for Reading Comprehension. ACL, 2017.
[ website ] [ bib ]
Mandar Joshi, Uma Sawant, Soumen Chakrabarti. Knowledge Graph and Corpus Driven Segmentation and Answer Inference for Telegraphic Entity-seeking Queries. EMNLP, 2014.
[ slides ]
Mandar Joshi, Rakesh Khobragade, Saurabh Sarda, Umesh Deshpande, Shiwali Mohan. Object-Oriented Representation and Hierarchical Reinforcement Learning in Infinite Mario. IEEE ICTAI, 2012 (Short).

Contact

Paul G. Allen Center for Computer Science and Engineering
University of Washington
Seattle, WA

Email: mandar90[at]cs[dot]washington[dot]edu