Oriol Vinyals: Rapid unsupervised learning progress thanks to contrastive losses, approaching supervised learning!
-40% Multitask SSL https://arxiv.org/abs/1708.07860 (2017)
-50% CPC https://arxiv.org/abs/1807.03748 (2018)
-70% AMDIM/MOCO/CPCv2/etc (2019)
-76.5% SimCLR https://arxiv.org/abs/2002.05709 (2020, so far) https://t.co/z1Q1yPi9pO
5 replies, 683 likes
Aäron van den Oord: Our latest work is out!
Representation Learning with Contrastive Predictive Coding (CPC).
Autoregressive modeling meets contrastive losses in the latent space.
Learn useful representations in an unsupervised way.
-> On Audio, Vision, NLP and RL.
Arxiv: https://arxiv.org/abs/1807.03748 https://t.co/3Tnpqt9N0v
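The core of CPC as described above is a contrastive (InfoNCE) loss in latent space: an autoregressive context vector must pick the true future latent out of a set of negatives. A minimal numpy sketch of that objective, with illustrative function name, shapes, and temperature parameter (not from the paper's code):

```python
import numpy as np

def info_nce_loss(context, candidates, positive_idx=0, temperature=1.0):
    """InfoNCE: cross-entropy of identifying the true future latent.

    context:      (d,) context vector c_t from the autoregressive model
    candidates:   (N, d) latent vectors; one positive, N-1 negatives
    positive_idx: index of the true (positive) latent in `candidates`
    """
    scores = candidates @ context / temperature        # (N,) similarity logits
    scores = scores - scores.max()                     # numerical stability
    log_probs = scores - np.log(np.exp(scores).sum())  # log-softmax over candidates
    return -log_probs[positive_idx]                    # -log p(positive | candidates)

# Toy check: a positive aligned with the context scores far better than chance.
c = np.array([1.0, 0.0])
cands = np.array([[5.0, 0.0],    # positive, aligned with c
                  [0.0, 5.0],    # negative, orthogonal
                  [-5.0, 0.0]])  # negative, anti-aligned
loss = info_nce_loss(c, cands)   # well below ln(3), the uniform-guessing loss
```

Minimizing this loss is what ties the "autoregressive modeling meets contrastive losses" framing together: the encoder and context network are trained jointly so that scores for true futures dominate the softmax.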
7 replies, 605 likes
Aravind Srinivas: Self-supervised learning explosion 💥
1 reply, 60 likes
Phillip Isola: (2/2) Extends / simplifies "Contrastive Predictive Coding" https://arxiv.org/abs/1807.03748
1. More views -> better reps
2. Contrastive learning outperforms predictive
3. On Imagenet, unsupervised Resnet-101 outperforms supervised Alexnet
0 replies, 18 likes
Pierre Richemond: It’s confirmed: the revolution will be self-supervised.
1 reply, 8 likes
Smerity: @ID_AA_Carmack Language in different modalities :) The compute for image/video slowed down the transfer of many of the methods but it's happening. Approaches like "Representation Learning with Contrastive Predictive Coding" (@avdnoord et al) seem a good middle ground.
1 reply, 6 likes
Blake Richards: @WiringTheBrain @so_evolutionary @GaryMarcus @TonyZador Here are a few recent ones if you're interested:
2 replies, 3 likes
Vivek Natarajan: This is really exciting! Vision's BERT moment is closer than ever.
0 replies, 1 like
Found on Feb 14 2020 at https://arxiv.org/pdf/1807.03748.pdf