Papers of the day

Meta Pseudo Labels

Comments

Quoc Le: New paper: Meta Pseudo Labels. Self-training uses a pre-trained teacher to generate pseudo labels to train a student. Here we use the student’s performance to meta-train the teacher to generate better pseudo labels. Works well on ImageNet 10%. Link: https://arxiv.org/abs/2003.10580 https://t.co/FQF6E0Vsda

7 replies, 443 likes
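The loop described in the tweet — the teacher pseudo-labels unlabeled data, the student trains on those labels, and the teacher is then meta-updated from the student's validation performance — can be sketched on a toy problem. Everything below (the linear models, the data, the learning rates, and the finite-difference meta-gradient) is an illustrative simplification, not the paper's implementation; the paper instead backpropagates an analytic approximation of the meta-gradient through one student update.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Toy binary task: an unlabeled pool for the teacher to pseudo-label,
# plus a small labeled validation set that drives the meta update.
X_unlab = rng.normal(size=(64, 5))
X_val = rng.normal(size=(32, 5))
y_val = (X_val[:, 0] > 0).astype(int)  # true label depends on feature 0

W_t = rng.normal(scale=0.1, size=(5, 2))  # teacher weights
W_s = rng.normal(scale=0.1, size=(5, 2))  # student weights
lr_s, lr_t, eps = 0.5, 0.5, 1e-4

def val_loss(W):
    # Cross-entropy of a linear classifier on the labeled validation set.
    p = softmax(X_val @ W)
    return -np.log(p[np.arange(len(y_val)), y_val] + 1e-9).mean()

def student_grad(W_s_cur, W_t_cur):
    # Cross-entropy gradient of the student against the teacher's
    # soft pseudo labels on the unlabeled batch.
    pseudo = softmax(X_unlab @ W_t_cur)
    p_s = softmax(X_unlab @ W_s_cur)
    return X_unlab.T @ (p_s - pseudo) / len(X_unlab)

init_loss = val_loss(W_s)
for step in range(100):
    # 1. Student takes one gradient step on the teacher's pseudo labels.
    W_s_new = W_s - lr_s * student_grad(W_s, W_t)
    # 2. Meta step: nudge the teacher so that the *student's* post-update
    #    validation loss goes down. Finite differences stand in here for
    #    the analytic one-step meta-gradient used in the paper.
    base = val_loss(W_s_new)
    grad_t = np.zeros_like(W_t)
    for i in range(W_t.shape[0]):
        for j in range(W_t.shape[1]):
            W_t_pert = W_t.copy()
            W_t_pert[i, j] += eps
            W_s_pert = W_s - lr_s * student_grad(W_s, W_t_pert)
            grad_t[i, j] = (val_loss(W_s_pert) - base) / eps
    W_t -= lr_t * grad_t
    W_s = W_s_new

final_loss = val_loss(W_s)
print(f"val loss: {init_loss:.3f} -> {final_loss:.3f}")
```

The key contrast with plain self-training: the teacher's weights are never fit to labels directly — they move only in the direction that makes the student's next update perform better on held-out data.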


Thang Luong: Nice additional gains achieved by MPL (Meta Pseudo Labels, https://arxiv.org/abs/2003.10580) on top of UDA (Unsupervised Data Augmentation, https://arxiv.org/abs/1904.12848) in low-data regimes! https://t.co/e2HTx4H1K1

0 replies, 98 likes


Bojan Tunguz: This work has two of my favorite words in it: *pseudo* AND *meta*.

1 replies, 8 likes


Daisuke Okanohara: In MPL, the teacher NN generates the target distribution during training so that the student NN performs well on a validation set, achieving new SOTA on few-shot and semi-supervised learning. https://arxiv.org/abs/2003.10580

0 replies, 2 likes


Content

Found on Mar 25 2020 at https://arxiv.org/pdf/2003.10580.pdf

PDF content of a computer science paper: Meta Pseudo Labels