Noah Smith designs data-driven algorithms for automated analysis of human language. His work tackles the core problems of natural language processing: parsing sentences in different languages into syntactic representations (MSX '09, AMBDS '16)
and semantic representations (DSCS '10; FTDCS '14, SBDS '16), as well as cross-cutting techniques for unsupervised
language learning (SE '05; CS '09). His 2011 book, Linguistic Structure
Prediction, synthesizes many of these statistical modeling techniques.
Some of the methods he has contributed recently include conditional random
field autoencoders (ADS '14),
linguistic regularizers (YS '14),
alternating directions dual decomposition,
retrofitting (FDJDHS '15),
recurrent neural network grammars (DKBS '16), entity language models (JTMCS '17),
scaffolds (STLZDS '18),
and rational recurrences (PSTS '18).
Such methods advance applications such as automatic translation.
At CMU, I taught courses on NLP at the undergraduate and graduate levels,
including a course originally called "Language and
Statistics II" and later "Structured Prediction for Language and Other
Discrete Data." On one occasion I taught the graduate course "Probabilistic
Graphical Models." I regularly led advanced seminars and lab courses
on NLP. In 2013, students in the lab developed open-source morphology tools,
Assyrian and Babylonian
(the author of each is credited at the GitHub or Bitbucket site).
At various times, I co-taught with colleagues including
Cohen, Chris Dyer, and
Bob Frederking.
My academic group is Noah's
ARK; we are part of the larger UW NLP fleet.
UW NLP is a great place to do NLP research! If you rely on automatic "metrics-based" ranking sites to learn where NLP research is happening, you might miss this fact. In particular, the maintainers of CSRankings.org have taken it upon themselves to categorize NLP researchers as academics or not. As a result, several of my colleagues and I have been excluded from the list of UW faculty because we have external, non-academic affiliations. (There are, of course, many shades of shared affiliation, and the site draws an arbitrary line across a complicated space, without a clear explanation or a careful effort to apply the rule evenly.) There's a lot to be said about this issue (perhaps I'll write something about it soon), but please note that the UW NLP group is thriving and, in my opinion, one of the best places in the world to be an NLP researcher.