


Edward Grefenstette: Happy to announce our paper on Generalized Inner Loop Meta-Learning, aka GIMLI, with @brandondamos, @denisyarats, Phu Mon Htut, Artem Molchanov, Franziska Meier, @douwekiela, @kchonyc, and @soumithchintala. THREAD [1/6]

5 replies, 296 likes

Edward Grefenstette: In parallel with this paper, @facebookai has released higher, a library for bypassing limitations on taking higher-order gradients over an optimization process. Library: Docs: Contributions very welcome.

1 replies, 251 likes

higher: ∂Hello/∂World! This is the dev team for higher, a @PyTorch library by @facebookAI which facilitates the implementation of gradient-based meta-learning algorithms. Code: PyPi: Docs:

1 replies, 121 likes
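The core obstacle higher addresses is that a stock PyTorch `optimizer.step()` mutates parameters in place, which cuts the autograd graph and makes gradients through the optimization process unavailable. The workaround it automates can be sketched in plain PyTorch: perform the inner update functionally with `create_graph=True`. This is a minimal toy sketch, not the library's own API; the model, data, and learning rate here are illustrative assumptions.

```python
import torch

# Toy scalar model and data (illustrative only).
w = torch.tensor([1.0], requires_grad=True)
x, y = torch.tensor([2.0]), torch.tensor([6.0])

def loss_fn(p):
    return ((p * x - y) ** 2).mean()

# Inner-loop SGD step done functionally: instead of an in-place
# optimizer.step() (which would break the graph), build the updated
# parameter as a new tensor, with create_graph=True so the update
# itself remains differentiable.
inner_lr = 0.1
g, = torch.autograd.grad(loss_fn(w), w, create_graph=True)
w_updated = w - inner_lr * g

# Outer (meta) loss is evaluated at the updated parameters; its
# gradient w.r.t. the original w flows back *through* the inner step.
meta_loss = loss_fn(w_updated)
meta_grad, = torch.autograd.grad(meta_loss, w)
print(meta_grad)  # ≈ -0.64 for this toy setup
```

higher packages this pattern behind monkey-patched functional modules and differentiable optimizers, so the same effect is obtained without hand-rolling the functional update for every parameter.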

Andrei Bursuc: MAML(s) for the masses: a new PyTorch library for implementing existing and developing new meta-learning algos. The source paper features a pedagogical description of inner loop meta-learning algos

0 replies, 74 likes

Miles Brundage: "Generalized Inner Loop Meta-Learning," @egrefen et al.:

1 replies, 59 likes

Ethan Rosenthal: Lots of fun things you can do with this, like in the referenced paper ( where the authors let the learning rate be a free parameter and then optimize it.

1 replies, 4 likes

Edward Grefenstette: To enable this work, and other work in this area, we released a library (higher) and described the general form of such meta-learning approaches with colleagues at @facebookai (9/16)

1 replies, 3 likes


Found on Oct 07 2019

PDF content of a computer science paper: GENERALIZED INNER LOOP META-LEARNING