Papers of the day

Prescribed Generative Adversarial Networks

Comments

Oct 31 2019 DeepMind

GANs are powerful generative models but they suffer from mode collapse and are hard to evaluate on test data. We developed PresGANs to address these two limitations: https://arxiv.org/abs/1910.04302 https://t.co/bJJQInCqjH
5 replies, 653 likes


Nov 01 2019 hardmaru 😷

Prescribed Generative Adversarial Networks: Adding noise to the generator's output can help prevent the mode collapse common in GANs, and also allows approximate log-likelihood evaluation. It's like killing two birds with one stone! By @adjiboussodieng et al. https://arxiv.org/abs/1910.04302 https://t.co/LIQwD9Hi99
2 replies, 201 likes
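The mechanism the tweet above describes can be sketched in a few lines of NumPy: "prescribing" the GAN means adding Gaussian observation noise to the generator's output, which turns the generator into a proper density model whose likelihood can at least be estimated. This is a toy illustration only, with a made-up linear "generator" and a crude Monte Carlo estimator over prior samples (the paper itself uses a more sophisticated, HMC-based evaluation and a trained network):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in "generator": a fixed linear map from 2-D latent z to 2-D output.
# (In PresGAN this would be a trained neural network; the weights here are arbitrary.)
W = np.array([[1.0, 0.5], [-0.3, 0.8]])

def generator(z):
    return z @ W.T

log_sigma = np.log(0.1)  # noise scale; learned jointly in PresGAN, fixed here

def sample_presgan(n):
    """Prescribed sample: generator output plus Gaussian observation noise."""
    z = rng.standard_normal((n, 2))
    mu = generator(z)
    sigma = np.exp(log_sigma)
    return mu + sigma * rng.standard_normal(mu.shape)

def approx_log_likelihood(x, n_samples=10_000):
    """Crude Monte Carlo estimate of log p(x) = log E_z[ N(x; G(z), sigma^2 I) ].

    Only illustrative: it shows why the added noise makes the density
    well-defined at all, which a noiseless GAN generator does not give you.
    """
    z = rng.standard_normal((n_samples, 2))
    mu = generator(z)                       # shape (S, 2)
    sigma = np.exp(log_sigma)
    d = x.shape[-1]
    log_norm = -0.5 * d * np.log(2 * np.pi * sigma**2)
    log_p = log_norm - 0.5 * np.sum((x - mu) ** 2, axis=1) / sigma**2
    m = log_p.max()                          # log-mean-exp for stability
    return m + np.log(np.mean(np.exp(log_p - m)))

x = sample_presgan(1)[0]
print(approx_log_likelihood(x))  # a finite log-density
```

Without the noise term the model distribution is supported on a low-dimensional manifold, so the log-likelihood of held-out data is undefined; the prescribed noise is what makes evaluation on test data possible.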


Oct 13 2019 Adji Bousso Dieng

Excited to head to LA (for the first time!) for this IPAM workshop http://www.ipam.ucla.edu/programs/workshops/workshop-ii-interpretable-learning-in-physical-sciences/?tab=schedule . I will talk about structure + deep generative models. I will later be at @UCBerkeley to discuss my latest work on Prescribed GANs (https://arxiv.org/abs/1910.04302).
3 replies, 82 likes


Oct 11 2019 roadrunner01

Prescribed Generative Adversarial Networks pdf: https://arxiv.org/pdf/1910.04302.pdf abs: https://arxiv.org/abs/1910.04302 github: https://github.com/adjidieng/PresGANs https://t.co/gxlvWeAjZ3
1 reply, 24 likes


Oct 15 2019 Anima Anandkumar (hiring)

Slides: http://tensorlab.cms.caltech.edu/users/anima/slides/IAS2019.pdf New optimization: competitive gradient descent (CGD) for training GANs/multi-agent systems. Implicit competitive regularization from CGD means that we get SOTA with no explicit gradient penalty, better stability, and no mode collapse #AI #DeepLearning https://twitter.com/RahelJhirad/status/1184134215463526401
1 reply, 17 likes
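The stability claim in the tweet above can be seen on the simplest adversarial problem: the bilinear zero-sum game f(x, y) = x·y, where x minimizes and y maximizes. Plain simultaneous gradient descent-ascent spirals away from the equilibrium at (0, 0), while the CGD update (here specialized in closed form to this scalar game, with an arbitrary step size; a toy illustration, not the slide deck's implementation) contracts toward it:

```python
import numpy as np

# Bilinear zero-sum game f(x, y) = x * y: x minimizes f, y maximizes f.
# The unique equilibrium is (0, 0).

eta = 0.2  # step size, chosen arbitrarily for illustration

def gda_step(x, y):
    """Simultaneous gradient descent-ascent: known to spiral outward here."""
    return x - eta * y, y + eta * x

def cgd_step(x, y):
    """Competitive gradient descent, specialized to f = x*y.

    For this game the cross-derivative D_xy f = 1, so the CGD update
    x+ = x - eta (1 + eta^2)^(-1) (y + eta x) simplifies to the
    closed form below; each player anticipates the other's move.
    """
    c = 1.0 / (1.0 + eta**2)
    return c * (x - eta * y), c * (y + eta * x)

x, y = 1.0, 1.0      # CGD iterate
xg, yg = 1.0, 1.0    # GDA iterate
for _ in range(100):
    x, y = cgd_step(x, y)
    xg, yg = gda_step(xg, yg)

print(f"CGD distance to equilibrium: {np.hypot(x, y):.2e}")   # shrinks
print(f"GDA distance to equilibrium: {np.hypot(xg, yg):.2e}") # grows
```

One can check that each CGD step scales the squared distance to the origin by 1/(1 + eta^2) < 1, while each GDA step scales it by 1 + eta^2 > 1, which is the "implicit competitive regularization" at work in miniature.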


Oct 31 2019 Jade Abbott

Amazing work from @adjiboussodieng and the DeepMind team! ✨🤩
0 replies, 15 likes


Oct 13 2019 arxiv

Prescribed Generative Adversarial Networks. http://arxiv.org/abs/1910.04302 https://t.co/8eKsA4iXMf
0 replies, 8 likes


Oct 31 2019 Awa

Amazing work from @adjiboussodieng
0 replies, 3 likes


Oct 31 2019 Adji Bousso Dieng

@deliprao @Miles_Brundage We made that same observation when working on our PresGAN paper. There were many repetitions in the generated samples from StyleGAN---but this is the case for most GANs because of mode collapse. I recommend reading this https://arxiv.org/abs/1910.04302 ...
0 replies, 3 likes

