
Modular Meta-Learning with Shrinkage


Sep 13 2019 Nando de Freitas

I recommend this paper, with theoretical and algorithmic insights on meta-learning, to researchers interested in hierarchical Bayes, MAML, and Reptile. It addresses the idea of learning reusable fixed and adaptive modules across many tasks.
0 replies, 495 likes

Sep 13 2019 Yutian Chen

Our method learns the flexibility of each module for task adaptation using a shrinkage prior. We're particularly interested in data efficiency rather than adaptation runtime.
0 replies, 32 likes
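To illustrate the idea in that tweet, here is a minimal numpy sketch of how a per-module shrinkage prior can control flexibility. It assumes a simplified quadratic task loss so the MAP adaptation has a closed form; the function name `shrinkage_adapt` and this toy setting are illustrative, not the paper's actual algorithm.

```python
import numpy as np

def shrinkage_adapt(phi, sigma2, task_target):
    """Closed-form task adaptation of one module's parameters under a
    Gaussian (shrinkage) prior N(phi, sigma2 * I), for the toy quadratic
    task loss 0.5 * ||theta - task_target||^2.

    Minimizing
        0.5 * ||theta - task_target||^2 + (1 / (2 * sigma2)) * ||theta - phi||^2
    gives the MAP solution below: it interpolates between the shared
    meta-parameters phi and the task optimum. sigma2 -> 0 effectively
    freezes the module at phi (a reused, fixed module); sigma2 -> inf
    lets the module adapt fully to the task."""
    weight = sigma2 / (sigma2 + 1.0)
    return phi + weight * (task_target - phi)
```

With a tiny learned variance the module barely moves from the shared parameters; with a large one it reaches the task optimum, which is the sense in which the prior decides which modules are fixed and which are adaptive.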

Sep 19 2019 Nando de Freitas

@fhuszar And for a Bayesian generalisation of Reptile, modular MAML, implicit gradients, and cool theory by @yutianc, I recommend this: .
0 replies, 19 likes

Sep 20 2019 Nando de Freitas

@svlevine @fhuszar Our new paper, , has an interpretation (shrinkage) that immediately generalises existing algorithms. It also lays out the theory for this and the necessity of validation data even in Bayesian models. Also, modular MAML is a previous great idea! 2/2.
0 replies, 14 likes

Sep 13 2019 Misha Denil

This is an insightful paper on modularity and consistency in meta-learning.
1 replies, 8 likes

Sep 14 2019 Daisuke Okanohara

Meta-learning requires inner-loop optimization for each task, and implicit differentiation can compute the meta-gradient directly from the inner-loop solution (iMAML). A similar idea is also proposed in the hierarchical Bayesian meta-learning setting.
0 replies, 8 likes
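The implicit-gradient trick mentioned above can be sketched as follows. This is a minimal numpy illustration of the iMAML-style identity, assuming access to the task-loss Hessian at the inner-loop solution; it is not the paper's implementation, which avoids materializing the Hessian.

```python
import numpy as np

def implicit_meta_grad(hess_task, grad_val, lam):
    """Meta-gradient via the implicit function theorem (iMAML-style).

    If theta* minimizes  L_task(theta) + (lam / 2) * ||theta - phi||^2,
    then the gradient of the validation loss w.r.t. the meta-parameters
    phi is
        (I + H / lam)^{-1} grad_val,
    where H is the Hessian of L_task at theta* and grad_val is the
    validation-loss gradient at theta*. Only the solution theta* is
    needed -- no unrolling or backpropagation through the inner loop."""
    d = grad_val.shape[0]
    return np.linalg.solve(np.eye(d) + hess_task / lam, grad_val)
```

Sanity checks: with a zero Hessian the meta-gradient equals the validation gradient, and a curved task loss damps it, which matches the intuition that strongly determined inner solutions pass less gradient back to the meta-parameters.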

Jan 14 2020 HotComputerScience

Most popular computer science paper of the day: "Modular Meta-Learning with Shrinkage"
0 replies, 4 likes

Sep 13 2019 Brundage Bot

Modular Meta-Learning with Shrinkage. Yutian Chen, Abram L. Friesen, Feryal Behbahani, David Budden, Matthew W. Hoffman, Arnaud Doucet, and Nando de Freitas
1 replies, 1 likes