Hao Peng is a Ph.D. candidate in the Paul G. Allen School of Computer Science and Engineering at the University of Washington, advised by Prof. Noah Smith. His research interests include natural language processing and machine learning. Before coming to UW, he was an undergraduate at Peking University. He has interned at DeepMind, Google AI, Microsoft Research Asia, and the University of Edinburgh. He is supported by a Google Fellowship.

I'm on the job market, looking for tenure-track faculty and industrial research positions.

Email: hapeng at cs.washington.edu

[CV] [Google Scholar]


Publications

ABC: Attention with Bounded-memory Control
Hao Peng, Jungo Kasai, Nikolaos Pappas, Dani Yogatama, Zhaofeng Wu, Lingpeng Kong, Roy Schwartz, and Noah A. Smith
Preprint, 2021

Tailor: Generating and Perturbing Text with Semantic Controls
Tongshuang Wu, Alexis Ross, Hao Peng, Matthew E. Peters, and Matt Gardner
Preprint, 2021

Finetuning Pretrained Transformers into RNNs
Jungo Kasai, Hao Peng, Yizhe Zhang, Dani Yogatama, Gabriel Ilharco, Nikolaos Pappas, Yi Mao, Weizhu Chen, and Noah A. Smith
In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021
[bib]

Random Feature Attention
Hao Peng, Nikolaos Pappas, Dani Yogatama, Roy Schwartz, Noah A. Smith, and Lingpeng Kong
In Proceedings of the International Conference on Learning Representations (ICLR), 2021
Spotlight
[bib] [code] [slides] [poster]

Deep Encoder, Shallow Decoder: Reevaluating the Speed-Quality Tradeoff in Machine Translation
Jungo Kasai, Nikolaos Pappas, Hao Peng, James Cross, and Noah A. Smith
In Proceedings of the International Conference on Learning Representations (ICLR), 2021
[bib] [code by Jungo]

Contextualized Perturbation for Textual Adversarial Attack
Dianqi Li, Yizhe Zhang, Hao Peng, Liqun Chen, Chris Brockett, Ming-Ting Sun, and Bill Dolan
In Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics (NAACL), 2021
[bib] [code by Dianqi]

Infusing Finetuning with Semantic Dependencies
Zhaofeng Wu, Hao Peng, and Noah A. Smith
Transactions of the Association for Computational Linguistics (TACL), 2020
[bib] [code by Zhaofeng]

A Mixture of h − 1 Heads is Better than h Heads
Hao Peng, Roy Schwartz, Dianqi Li, and Noah A. Smith
In Proceedings of the Association for Computational Linguistics (ACL), 2020
[bib] [code] [slides]

PaLM: A Hybrid Parser and Language Model
Hao Peng, Roy Schwartz, and Noah A. Smith
In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP), 2019
[bib] [code]

RNN Architecture Learning with Sparse Regularization
Jesse Dodge, Roy Schwartz, Hao Peng, and Noah A. Smith
In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP), 2019
[bib] [code by Jesse]

Text Generation with Exemplar-based Adaptive Decoding
Hao Peng, Ankur P. Parikh, Manaal Faruqui, Bhuwan Dhingra, and Dipanjan Das
In Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics (NAACL), 2019
[bib] [slides] [code]

Rational Recurrences
Hao Peng, Roy Schwartz, Sam Thomson, and Noah A. Smith
In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP), 2018
[bib] [slides] [code]

Backpropagating through Structured Argmax using a SPIGOT
Hao Peng, Sam Thomson, and Noah A. Smith
In Proceedings of the Association for Computational Linguistics (ACL), 2018
Best Paper Honorable Mention
[bib] [slides] [code]

Learning Joint Semantic Parsers from Disjoint Data
Hao Peng, Sam Thomson, Swabha Swayamdipta, and Noah A. Smith
In Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics (NAACL), 2018
[bib] [slides]

"You are no Jack Kennedy": On Media Selection of Highlights from Presidential Debates
Chenhao Tan, Hao Peng, and Noah A. Smith
In Proceedings of The Web Conference (WWW), 2018
[bib] [data]

Deep Multitask Learning for Semantic Dependency Parsing
Hao Peng, Sam Thomson, and Noah A. Smith
In Proceedings of the Association for Computational Linguistics (ACL), 2017
[bib] [code] [poster]


Older Publications

News Citation Recommendation with Implicit and Explicit Semantics
Hao Peng, Jing Liu, and Chin-Yew Lin
In Proceedings of the Association for Computational Linguistics (ACL), 2016
[bib]

A Convolutional Attention Network for Extreme Summarization of Source Code
Miltiadis Allamanis, Hao Peng, and Charles Sutton
In Proceedings of the International Conference on Machine Learning (ICML), 2016
[bib] [code by Miltos] [data]

Discriminative Neural Sentence Modeling by Tree-based Convolution
Lili Mou*, Hao Peng*, Ge Li, Yan Xu, Lu Zhang, and Zhi Jin. (*: Equal contribution)
In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP), 2015
[bib]

Classifying Relations via Long Short Term Memory Networks along Shortest Dependency Paths
Yan Xu, Lili Mou, Ge Li, Yunchuan Chen, Hao Peng, and Zhi Jin
In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP), 2015
[bib]


Miscellany

Hao plays the violin and enjoys traveling. He loves Gustav Mahler (see also: Mahleria) and Fyodor Dostoyevsky.