Training Language GANs from Scratch

May 27 2019 hardmaru

Training Language GANs from Scratch Latest work that attempts to train a language model entirely using GAN discriminator's loss function rather than maximum likelihood loss. Large population batch size and dense reward signal make REINFORCE less unhappy. https://arxiv.org/abs/1905.09922 https://t.co/7058pHP1nU
2 replies, 510 likes
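The "dense reward signal" mentioned above refers to the discriminator scoring every generated token rather than only the finished sequence, which spreads credit across time steps and tames the variance of REINFORCE. A minimal numpy sketch of this idea (all values are illustrative stand-ins, not the paper's actual model or discriminator):

```python
import numpy as np

rng = np.random.default_rng(0)

def reinforce_loss(log_probs, rewards, baseline=0.0):
    """REINFORCE objective for one sampled sequence.

    log_probs: per-token log-probabilities of the sampled tokens, shape (T,)
    rewards:   per-token rewards, e.g. a discriminator score at each
               step (dense) or a single score at the end (sparse).
    Returns the scalar loss whose gradient is the policy-gradient estimate.
    """
    # Reward-to-go: each token is credited with all reward from its step on.
    returns = np.cumsum(rewards[::-1])[::-1]
    advantages = returns - baseline
    return -np.sum(advantages * log_probs)

T = 8
log_probs = rng.normal(-2.0, 0.5, size=T)       # stand-in for model log-probs
dense = rng.uniform(0.4, 0.6, size=T)           # per-token discriminator score
sparse = np.zeros(T); sparse[-1] = dense.sum()  # same total reward, end-only

print(reinforce_loss(log_probs, dense))
print(reinforce_loss(log_probs, sparse))
```

With an end-only reward, every token receives the same full-sequence return, so individual tokens get no differentiated credit; the dense signal assigns per-step credit, which is the variance-reduction effect the tweet alludes to.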


May 27 2019 William Fedus

Something near and dear to my heart, text GANs from scratch. de Masson d'Autume et al. (2019) show that with a dense reward structure as in MaskGAN, discriminator regularization, and larger batch sizes to reduce variance, performance comparable to MLE can be achieved.
1 replies, 21 likes
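"Discriminator regularization" in the tweet above covers a family of tricks that keep the discriminator from overpowering the generator. One common, simple form is an L2 weight penalty on the discriminator, sketched below (this is a generic stand-in for illustration; the paper's exact regularizer may differ):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def l2_regularized_disc_loss(d_real, d_fake, weight, params):
    """Discriminator loss with L2 weight regularization.

    d_real, d_fake: discriminator logits on real and generated text
    weight:         regularization strength
    params:         list of discriminator weight arrays
    """
    # Standard binary cross-entropy GAN loss on the logits.
    bce = -np.mean(np.log(sigmoid(d_real)) + np.log(1.0 - sigmoid(d_fake)))
    # L2 penalty discourages large weights, smoothing the discriminator.
    reg = weight * sum(np.sum(p ** 2) for p in params)
    return bce + reg
```

A smoother discriminator gives the generator a more informative (less saturated) reward signal, which matters even more when that reward is the only training signal, as in this from-scratch setting.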


May 27 2019 roadrunner01

Training Language GANs from Scratch pdf: https://arxiv.org/pdf/1905.09922.pdf abs: https://arxiv.org/abs/1905.09922 https://t.co/zUwtrrVa8d
1 replies, 20 likes


May 31 2019 Sasha Rush

Mihaela notes that they are getting a lot better at training RL-based Text GANs with all the latest GAN tricks: https://arxiv.org/abs/1905.09922?context=stat.ML
0 replies, 15 likes


May 28 2019 arxiv

Training language GANs from Scratch. http://arxiv.org/abs/1905.09922 https://t.co/N6U9i2DMfl
0 replies, 6 likes


Jun 16 2019 Andriy Burkov

We show it is in fact possible to train a language GAN from scratch -- without maximum likelihood pre-training. We combine existing techniques such as large batch sizes, dense rewards and discriminator regularization to stabilize and improve language GANs https://arxiv.org/abs/1905.09922
0 replies, 1 likes
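The abstract quoted above credits large batch sizes with stabilizing training. The mechanism is generic Monte-Carlo variance reduction: averaging N independent per-sequence gradient samples shrinks the variance of the estimate roughly by 1/N. A toy numpy demonstration (the "gradient" samples here are synthetic noise, not real model gradients):

```python
import numpy as np

rng = np.random.default_rng(42)

def grad_estimate(batch_size):
    """Monte-Carlo policy-gradient estimate averaged over a batch.
    Each sample stands in for one noisy per-sequence gradient."""
    samples = rng.normal(loc=1.0, scale=5.0, size=batch_size)
    return samples.mean()

# Empirical variance of the estimate shrinks roughly as 1/batch_size.
small = np.var([grad_estimate(16) for _ in range(2000)])
large = np.var([grad_estimate(512) for _ in range(2000)])
print(small, large)
```

With a 32x larger batch, the variance of the averaged estimate drops by about 32x, which is why large batches make high-variance REINFORCE-style updates usable for language GANs.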


May 28 2019 BLACKSTEM Global💡🔬🌼

If this diagram reads like gibberish to you, it's time to take a summer class to refresh your technical skills. Check @udemy , @coursera , @khanacademy , @MITxonedX , and other online training program sites to sign up for classes that you can attend. REGISTER TODAY!
0 replies, 1 likes
