Reflective Decoding: Unsupervised Paraphrasing and Abductive Reasoning

by Peter West, Ximing Lu, Ari Holtzman, Chandra Bhagavatula, Jena Hwang, Yejin Choi

Reflective Decoding uses generated contexts as an intermediate meaning representation, enabling applications such as unsupervised paraphrasing and abductive reasoning

We present a novel, unsupervised method that extends LM decoding to new tasks. Given an input, Reflective Decoding samples contexts that capture its meaning, then uses reverse-direction LMs to reflect back from context into text. It is simple and intuitive to apply, requiring only a forward and a backward language model -- no fine-tuning required!
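The two-phase structure can be sketched as follows. This is a minimal illustrative skeleton, not the released implementation: `sample_forward` and `sample_backward` are hypothetical sampler callables standing in for forward and backward autoregressive LMs.

```python
def reflective_decode(source, sample_forward, sample_backward,
                      n_contexts=3, n_per_context=2):
    """Structural sketch of Reflective Decoding (two phases).

    Phase 1 (contextualize): sample contexts that plausibly surround
    `source`, here right-hand continuations from a forward LM.
    Phase 2 (reflect): for each context, use a backward LM to generate
    text that leads into that context -- candidate restatements of
    `source` grounded in the same contexts.
    """
    # Phase 1: contexts that capture the input's meaning.
    contexts = [sample_forward(source) for _ in range(n_contexts)]

    # Phase 2: reflect back from each context into new text.
    candidates = []
    for ctx in contexts:
        for _ in range(n_per_context):
            candidates.append(sample_backward(ctx))
    return contexts, candidates


# Toy stand-in samplers (purely for demonstration; a real setup would
# wrap trained forward and backward language models):
def toy_forward(text):
    # Pretend the last word of the input starts a sampled continuation.
    return text.split()[-1] + " ..."

def toy_backward(context):
    # Pretend to generate text that leads into the given context.
    return "text leading into: " + context


contexts, candidates = reflective_decode(
    "the cat sat", toy_forward, toy_backward,
    n_contexts=2, n_per_context=2)
print(len(contexts), len(candidates))
```

In the real method the reflected candidates are then scored by how well they regenerate the sampled contexts, which this sketch omits for brevity.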

[paper] [code] [video] [models] [contact]