Papers of the day



Aug 14 2019 Sean Welleck

our new paper: "Neural Text d̶e̶Generation with Unlikelihood Training" is now on arxiv! (w/ @uralik1, @stephenroller, Emily Dinan, @kchonyc, @jaseweston) A step towards solving the case of neural text degeneration 🔎
7 replies, 244 likes

Sep 16 2019 Sean Welleck

code and pre-trained models for "Neural Text Generation with Unlikelihood Training" now available! - Train and fine-tune LMs with unlikelihood - 🚨fine-tune a GPT-2 model from pytorch-transformers with unlikelihood
0 replies, 155 likes

Oct 02 2019 Ilia Kulikov

💡Update on "Neural Text Generation with Unlikelihood Training" !💡 new: - beam+ngram blocking & nucleus sampling in the human evaluation - analysis of token generation frequency distributions (with examples!) arxiv: w/ @wellecks
0 replies, 123 likes

Aug 14 2019 Kyunghyun Cho

since we know there are problems that we don't necessarily talk about, let's try to tackle one problem at a time, and let us, @uralik1, @wellecks, @jaseweston, @stephenroller and Emily Dinan, take one step for now. thanks to @YejinChoinka and @AlecRad and their (cont)
2 replies, 72 likes

Oct 02 2019 jaseweston

Unlikelihood training beats nucleus sampling and beam blocking for LM generation (new human eval results added on arXiv paper!)
0 replies, 69 likes

Aug 14 2019 Thomas Lahore

Neural Text Generation with Unlikelihood Training "We propose a new objective, unlikelihood training, which forces unlikely generations to be assigned lower probability by the model."
0 replies, 18 likes
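The objective quoted above can be sketched at the token level: keep the usual negative log-likelihood on the target token, and add a term that penalizes probability mass assigned to "negative" candidate tokens (e.g. tokens the model has already repeated). This is a minimal illustrative sketch, not the paper's exact implementation: the function name `unlikelihood_loss`, the toy distribution, and the `alpha` weighting are assumptions for exposition.

```python
import math

def unlikelihood_loss(probs, target, negatives, alpha=1.0):
    """Token-level unlikelihood sketch.

    probs:     next-token probability distribution (list of floats summing to 1)
    target:    index of the ground-truth token (standard NLL term)
    negatives: indices of candidate tokens whose probability should be pushed down
    alpha:     weight on the unlikelihood term
    """
    # Standard likelihood term: raise probability of the target token.
    nll = -math.log(probs[target])
    # Unlikelihood term: -log(1 - p(c)) grows as p(c) -> 1,
    # so minimizing it forces the negative candidates toward low probability.
    ul = -sum(math.log(1.0 - probs[c]) for c in negatives)
    return nll + alpha * ul

# Toy next-token distribution over a 4-token vocabulary:
# token 0 is the target, token 1 is a previously repeated token.
probs = [0.5, 0.3, 0.1, 0.1]
loss = unlikelihood_loss(probs, target=0, negatives=[1])
```

In the paper's setting the negative candidates are drawn from the preceding context (penalizing repeats), and the same idea is applied at the sequence level to decoded continuations.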

Aug 14 2019 Ilya Kulikov

our new work is on arxiv! w/ @wellecks @kchonyc @jaseweston @stephenroller Emily Dinan!
0 replies, 15 likes

Oct 02 2019 Sean Welleck

and results on fine-tuning GPT-2 with unlikelihood!
1 reply, 7 likes