Papers of the day

Learning Representations by Maximizing Mutual Information Across Views


Jul 09 2019 Devon Hjelm

Latest version of #AMDIM, a self-supervised method that gets 68% unsupervised ImageNet + linear-probe classification, far outstripping prior and recent results by 7%+ with a fraction of the compute. Code by @philip_bachman and @wbuchw
2 replies, 242 likes

Jul 15 2019 David Krueger

If you're in ML, you've probably heard about BigBiGAN because of the DM PR machine. But you may not have heard about this paper by @philip_bachman et al. that came 4 days later and crushes their results.
4 replies, 183 likes

Jun 04 2019 Devon Hjelm

Our work extending Deep InfoMax by maximizing MI across views is out! We achieve SOTA on several unsupervised + linear probe benchmarks, including impressive results on Imagenet with a fraction of the computation of competitors
0 replies, 68 likes

Sep 03 2019 Devon Hjelm

Our paper on Augmented Multiscale DIM (AMDIM), the self-supervised model that beats prior SOTA on unsupervised ImageNet + supervised linear probe by 7%, was accepted as a poster at NeurIPS. Led by @philip_bachman with Will Buchwalter
1 reply, 27 likes

Jul 17 2019 Daisuke Okanohara

They improved self-supervised representation learning over local Deep InfoMax significantly by using 1) independently-augmented versions of each input, 2) multiple scales simultaneously, and 3) a powerful encoder with a controlled receptive field
0 replies, 14 likes
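The core idea behind these tweets, maximizing mutual information between features of independently augmented views via a contrastive bound, can be sketched with a toy NCE-style objective. This is a minimal illustration under my own assumptions (`infonce_loss`, the temperature value, and the random features are made up for the sketch), not AMDIM's actual multiscale implementation:

```python
import numpy as np

def infonce_loss(feats_a, feats_b, temperature=0.1):
    """NCE-style lower bound on mutual information between two views.

    feats_a, feats_b: (batch, dim) features from independently augmented
    views of the same inputs. Matching rows are positive pairs; every
    other row in the batch serves as a negative sample.
    """
    # Cosine similarities between all cross-view feature pairs.
    a = feats_a / np.linalg.norm(feats_a, axis=1, keepdims=True)
    b = feats_b / np.linalg.norm(feats_b, axis=1, keepdims=True)
    logits = a @ b.T / temperature                   # (batch, batch)
    # Softmax cross-entropy where the positive pair is on the diagonal.
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 16))
# "Augmented" view: a lightly perturbed copy, so positives stay aligned.
loss_matched = infonce_loss(x + 0.01 * rng.normal(size=x.shape), x)
# Unrelated features: positives carry no information about each other.
loss_random = infonce_loss(rng.normal(size=(8, 16)), x)
```

Minimizing this loss pushes features of the two views of the same input together while spreading apart features of different inputs, which is why `loss_matched` comes out smaller than `loss_random` here; AMDIM additionally applies the objective across multiple scales of the encoder's feature maps.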

Jul 16 2019 Christian Szegedy

Amazing new self-supervised training method for visual models.
0 replies, 12 likes

Dec 10 2019 Devon Hjelm

@ylecun Don't forget AMDIM, which achieved over 68% in ImageNet pretraining + linear probing 6 months ago and will be at @NeurIPSConf Thursday East Exhibition hall B+C #32!
0 replies, 10 likes

Jul 26 2019 William Buchwalter

Pre-trained models for AMDIM are now available:
0 replies, 9 likes

Jul 28 2019 Adam Trischler

If you'd like to play with some powerful, pre-trained image representations (learned via self-supervised methods and computationally reasonable), check these out!
0 replies, 8 likes

Jul 10 2019 Ankesh Anand

Exciting progress in unsupervised representation learning! Seems like the pendulum has swung towards contrastive methods again.
1 reply, 7 likes

Jul 16 2019 IntuitionMachine

Self-supervised learning by maximizing mutual information between arbitrary features extracted from multiple views of a shared context. #deeplearning #selfsupervised
0 replies, 3 likes