Papers of the day

YOUR CLASSIFIER IS SECRETLY AN ENERGY BASED MODEL AND YOU SHOULD TREAT IT LIKE ONE

Comments

David Duvenaud: Classifiers are secretly energy-based models! Every softmax giving p(c|x) has an unused degree of freedom, which we use to compute the input density p(x). This makes classifiers into generative models without changing the architecture. https://arxiv.org/abs/1912.03263 https://t.co/IzMPxiNxFQ

12 replies, 1590 likes
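
A minimal sketch of the idea in the tweet above (not the authors' released code), assuming a PyTorch classifier `net` whose forward pass returns logits:

    import torch
    import torch.nn.functional as F

    # A standard classifier's logits f(x) can be reused as an unnormalized
    # log-density over inputs: log p(x) = logsumexp_y f(x)[y] - log Z,
    # where Z is an intractable constant shared across all x.

    def log_px_unnormalized(logits: torch.Tensor) -> torch.Tensor:
        """Unnormalized log p(x): logsumexp over the class dimension."""
        return torch.logsumexp(logits, dim=-1)

    def log_py_given_x(logits: torch.Tensor) -> torch.Tensor:
        """The usual classifier: log p(y|x) = log_softmax(logits)."""
        return F.log_softmax(logits, dim=-1)

    # logits = net(x)                         # shape [batch, num_classes]
    # energy = -log_px_unnormalized(logits)   # lower energy = higher density p(x)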


will grathwohl: if y'all r lookin for somethin to read while walking around Vancouver, check out my newest paper: "Your Classifier is Secretly an Energy-Based Model and You Should Treat it Like One" https://arxiv.org/abs/1912.03263 with @kcjacksonwang @jh_jacobsen @DavidDuvenaud @Mo_Norouzi @kswersk

3 replies, 162 likes


Alf @ NeurIPS: «Your Classifier is Secretly an Energy Based Model and You Should Treat it Like One» by @wgrathwohl et al. Energy based training of the joint improves calibration, robustness, out-of-distribution detection, and generative quality. https://arxiv.org/abs/1912.03263 https://t.co/BcUaDO9Ghr

3 replies, 156 likes


Sasha Rush: One last #ICLR2020 data point: All posters were widely viewed relative to a live conference (even if folks were a bit zoom shy). The median had ~200 unique views and even the 10th percentile was ~100 uniques. Max viewed poster (1000 uniques): https://iclr.cc/virtual/poster_Hkxzx0NtDB.html https://t.co/q3faDcnNYz

4 replies, 150 likes


Alf @ 𝑣ʳIPS: Since you seem interested in EBMs, let me share a few slides from @ylecun class. https://cs.nyu.edu/~yann/2004f-G22-3033-002/diglib/lecture04-backprop.pdf Slides 16 and 17 explain the equivalence between softargmax + log loss and EBM with negative log-likelihood loss. https://t.co/4lqUQoFZKw

1 replies, 82 likes
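
For reference, the equivalence those slides describe can be written out directly; this is the standard derivation, not copied from the slides. With per-class energies E(x, y) = -f_\theta(x)[y]:

    \begin{align*}
    p_\theta(y \mid x) &= \frac{e^{-E(x,y)}}{\sum_{y'} e^{-E(x,y')}}
                        = \frac{e^{f_\theta(x)[y]}}{\sum_{y'} e^{f_\theta(x)[y']}}
                        && \text{(softargmax)} \\
    -\log p_\theta(y \mid x) &= E(x,y) + \log \sum_{y'} e^{-E(x,y')}
                        && \text{(log loss = EBM negative log-likelihood)}
    \end{align*}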


Mohammad Norouzi: Cross entropy loss is invariant to shifting logits by any constant. Our paper uses this extra degree of freedom to define an energy based generative model of input data. This improves calibration and adversarial robustness of the corresponding classifier.

0 replies, 28 likes
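
A quick standalone check of the invariance described above (illustrative only, not from the paper's code):

    import torch
    import torch.nn.functional as F

    # Cross-entropy depends only on differences between logits, so adding any
    # per-example constant leaves p(y|x) unchanged; that constant is the
    # "extra degree of freedom" used to model log p(x).
    logits = torch.randn(4, 10)           # 4 toy examples, 10 classes
    labels = torch.randint(0, 10, (4,))
    shift = torch.randn(4, 1)             # arbitrary per-example constant

    print(torch.allclose(F.cross_entropy(logits, labels),
                         F.cross_entropy(logits + shift, labels),
                         atol=1e-6))      # True: p(y|x) is unchanged

    # The shift does change logsumexp(logits), i.e. the unnormalized log p(x):
    print(torch.logsumexp(logits + shift, dim=-1) - torch.logsumexp(logits, dim=-1))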


will grathwohl: Super excited to announce the release of the official implementation of JEM here: https://wgrathwohl.github.io/JEM/ Implementation for my latest paper: https://arxiv.org/abs/1912.03263 Major shout-out to my co-authors @kcjacksonwang @jh_jacobsen @DavidDuvenaud @Mo_Norouzi @kswersk

2 replies, 27 likes


roadrunner01: Your Classifier is Secretly an Energy Based Model and You Should Treat it Like One pdf: https://arxiv.org/pdf/1912.03263.pdf abs: https://arxiv.org/abs/1912.03263 https://t.co/wYKPppe72X

0 replies, 25 likes


Kevin Swersky: My journey into machine learning research began with energy based models (RBMs). It’s been fun revisiting them! They can be quite powerful if trained well.

1 replies, 20 likes


MONTREAL.AI: Your Classifier is Secretly an Energy Based Model and You Should Treat it Like One Grathwohl et al.: https://arxiv.org/abs/1912.03263 #ArtificialIntelligence #DeepLearning #MachineLearning https://t.co/xq2rTN7g6E

0 replies, 11 likes


Sabrina J. Mielke: "Your classifier is secretly an energy based model and you should treat it like one" Will Grathwohl (@wgrathwohl), KC Wang (@kcjacksonwang), Joern-Henrik Jacobsen (@jh_jacobsen), @DavidDuvenaud, Mohammad Norouzi (@Mo_Norouzi), Kevin Swersky (@kswersk) https://twitter.com/wgrathwohl/status/1203848404717228033

1 replies, 9 likes


Guillaume Verdon @#Q2B19: EBMs are the new GANs

0 replies, 9 likes


Joe Davison @ NeurIPS: Very interesting and well-written paper posted just last week. The authors argue that classifiers can be thought of as energy-based models and that by treating them as such, they can jointly learn both generative and discriminative models https://arxiv.org/abs/1912.03263 #machinelearning https://t.co/pdWbineopH

1 replies, 9 likes
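
The joint learning described here can be sketched as a factorization of the joint log-likelihood (the sampling machinery needed for the generative term, e.g. SGLD, is described in the paper and omitted here). Reusing the logits f_\theta(x) as unnormalized joint log-densities:

    \begin{align*}
    p_\theta(x, y) &= \frac{e^{f_\theta(x)[y]}}{Z(\theta)}, \qquad
    p_\theta(x) = \sum_{y} p_\theta(x, y) \\
    \log p_\theta(x, y) &= \underbrace{\log p_\theta(x)}_{\text{generative term}}
      + \underbrace{\log p_\theta(y \mid x)}_{\text{standard cross-entropy}}
    \end{align*}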


HotComputerScience: Most popular computer science paper of the day: "YOUR CLASSIFIER IS SECRETLY AN ENERGY BASED MODEL AND YOU SHOULD TREAT IT LIKE ONE" https://hotcomputerscience.com/paper/your-classifier-is-secretly-an-energy-based-model-and-you-should-treat-it-like-one https://twitter.com/DavidDuvenaud/status/1204143678865866752

0 replies, 4 likes


Erik Nijkamp: Reinterpretation of classifier p(y|x) as EBM over joint p(x,y). Well received at #ICLR2020. Well deserved. (@wgrathwohl, @kcjacksonwang, J. H. Jacobsen, @DavidDuvenaud, @mo_norouzi, @kswersk): https://arxiv.org/abs/1912.03263

0 replies, 3 likes


Jackson Wang: Treating a model right is the least we can do. Check out how we used the last degree of freedom 😉

0 replies, 2 likes


tsauri: Big labs taking advantage of ICLR's double-blind review system by posting an arXiv version with the header "Under review as a conference paper at ICLR 2020"

0 replies, 1 likes


Chris J. Maddison: @carlesgelada Graphical model ideas in modern DL: https://arxiv.org/abs/1702.00887; Energy-based model ideas in modern DL: https://arxiv.org/abs/1912.03263. Do you think these would have been invented as quickly if the students hadn't at least had a passing familiarity with the history of our field?

0 replies, 1 likes


Alf @ 𝑣ʳIPS: https://twitter.com/DavidDuvenaud/status/1204143678865866752

0 replies, 1 likes


Brinda Thomas: Today I have discovered a new (to me) form of energy.

0 replies, 1 likes


Content

Found on Dec 09 2019 at https://arxiv.org/pdf/1912.03263.pdf

PDF content of a computer science paper: YOUR CLASSIFIER IS SECRETLY AN ENERGY BASED MODEL AND YOU SHOULD TREAT IT LIKE ONE