
META-LEARNING DEEP ENERGY-BASED MEMORY MODELS

Comments

DeepMind: Our new work on memory uses a neural network's weights as fast and compressive associative storage. Reading from the memory is performed by approximate minimisation of the energy modelled by the network. https://arxiv.org/abs/1910.02720 https://t.co/6uPAD9asgv

6 replies, 984 likes


Sergey Bartunov: Excited to share some recent work! If you are bored of stacked vectors as a memory model, look here

0 replies, 94 likes


Simon Osindero: Excited to share some new work in collaboration with @sbos, Jack Rae, and Tim Lillicrap -- demonstrating a new method for using a neural network's weights for fast & compressive memory storage.

1 reply, 23 likes


Thomas Miconi: This is so ridiculously clever!

1 reply, 22 likes


Simon Osindero: Also happy to share that another #ICLR2020 submission was accepted -- our work on "Meta-Learning Deep Energy-Based Memory Models" with @sbos, Jack Rae, and Tim Lillicrap.

0 replies, 9 likes


Tiago Ramalho: Just read this paper, good work @sbos! We haven't been paying enough attention to quick associative learning. This is an interesting first step.

0 replies, 2 likes


4InData: #4InData #MetaLearning #DeepEnergy #MachineLearning #NeuralNetworks Meta-Learning Deep Energy-Based Memory Models https://arxiv.org/abs/1910.02720 https://t.co/ityIWcHD6D

0 replies, 1 likes


Nick Fisher: Intriguing work from DeepMind on meta-learning with energy-based memory models https://arxiv.org/pdf/1910.02720.pdf

0 replies, 1 likes


Content

Found on Oct 21 2019 at https://arxiv.org/pdf/1910.02720.pdf

PDF content of a computer science paper: META-LEARNING DEEP ENERGY-BASED MEMORY MODELS