
Meta Pseudo Labels


Quoc Le: New paper: Meta Pseudo Labels. In self-training, a pre-trained teacher generates pseudo labels to train a student. Here we use the student's performance to meta-train the teacher to generate better pseudo labels. Works well on ImageNet 10%. Link:

7 replies, 443 likes

Thang Luong: Nice additional gains achieved by MPL (Meta Pseudo Labels) on top of UDA (Unsupervised Data Augmentation) in low-data regimes!

0 replies, 98 likes

Bojan Tunguz: This work has two of my favorite words in it: *pseudo* AND *meta*.

1 reply, 8 likes

Daisuke Okanohara: In MPL, the teacher network generates the target distribution over the course of training so that the student network performs well on a validation set, achieving new SOTA on few-shot and semi-supervised learning benchmarks.

0 replies, 2 likes
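The mechanism described in the tweets above can be sketched in a toy form. The snippet below is a minimal illustration, not the paper's implementation: the teacher and student are each a single logistic weight, the student takes a gradient step toward the teacher's soft pseudo labels, and the teacher is then meta-updated from the student's validation loss. The paper computes the meta-gradient through the student's update; here a finite-difference approximation stands in for it, and all data, model sizes, and learning rates are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data (hypothetical): unlabeled inputs for pseudo-labeling,
# plus a small labeled validation set where the true label is sign(x).
x_unlab = rng.normal(size=64)
x_val = rng.normal(size=64)
y_val = (x_val > 0).astype(float)

w_teacher, w_student = 0.1, 0.0
lr = 0.5

def val_loss(w):
    # Binary cross-entropy of a one-weight logistic model on the val set.
    p = sigmoid(w * x_val)
    return -np.mean(y_val * np.log(p + 1e-9) + (1 - y_val) * np.log(1 - p + 1e-9))

for step in range(200):
    # 1) Teacher produces soft pseudo labels on the unlabeled batch.
    pseudo = sigmoid(w_teacher * x_unlab)
    # 2) Student takes one gradient step on cross-entropy toward
    #    the teacher's soft targets.
    p_s = sigmoid(w_student * x_unlab)
    grad_s = np.mean((p_s - pseudo) * x_unlab)
    w_student_new = w_student - lr * grad_s
    # 3) Teacher is meta-updated from the student's validation
    #    performance; a finite difference approximates how the
    #    student's post-update val loss changes with the teacher.
    eps = 1e-3
    pseudo_eps = sigmoid((w_teacher + eps) * x_unlab)
    grad_s_eps = np.mean((p_s - pseudo_eps) * x_unlab)
    w_student_eps = w_student - lr * grad_s_eps
    meta_grad = (val_loss(w_student_eps) - val_loss(w_student_new)) / eps
    w_teacher = w_teacher - lr * meta_grad
    w_student = w_student_new

print(val_loss(w_student))  # falls below the 0.693 chance-level loss
```

The key difference from plain self-training is step 3: the teacher is not frozen but receives a learning signal from how well its pseudo labels actually helped the student on held-out labeled data.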


Found on Mar 25 2020 at
