
META-LEARNING DEEP ENERGY-BASED MEMORY MODELS

Comments

Oct 21 2019 DeepMind

Our new work on memory uses a neural network's weights as fast and compressive associative storage. Reading from the memory is performed by approximate minimisation of the energy modelled by the network. https://arxiv.org/abs/1910.02720 https://t.co/6uPAD9asgv
6 replies, 982 likes
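
A minimal sketch of the idea described in the tweet, for illustration only: the memory is the weights of an energy network, and reading is approximate minimisation of that energy with respect to the input. The names `EnergyNet`, `write`, and `read`, the MLP architecture, and the plain gradient-descent writing step are assumptions for this sketch; the paper itself meta-learns the writing procedure rather than using raw SGD.

```python
import torch
import torch.nn as nn

# Hypothetical energy network: a small MLP mapping a pattern to a scalar energy.
class EnergyNet(nn.Module):
    def __init__(self, dim=64, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)  # scalar energy per pattern

def write(energy_net, patterns, steps=100, lr=1e-2):
    """Store patterns in the weights by lowering their energy.
    Stand-in for the meta-learned writing rule described in the paper."""
    opt = torch.optim.SGD(energy_net.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        energy_net(patterns).mean().backward()
        opt.step()

def read(energy_net, query, steps=50, lr=0.1):
    """Retrieve a pattern from a corrupted query by approximately
    minimising the energy with respect to the input."""
    x = query.clone().requires_grad_(True)
    for _ in range(steps):
        energy = energy_net(x).sum()
        (grad,) = torch.autograd.grad(energy, x)
        x = (x - lr * grad).detach().requires_grad_(True)
    return x.detach()

# Usage sketch: store a few random patterns, then read back from a noisy cue.
patterns = torch.randn(8, 64)
net = EnergyNet()
write(net, patterns)
retrieved = read(net, patterns + 0.1 * torch.randn_like(patterns))
```
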


Oct 21 2019 Sergey Bartunov

Excited to share some recent work! If you are bored of stacked vectors as a memory model, look here
0 replies, 94 likes


Oct 21 2019 Simon Osindero

Excited to share some new work in collaboration with @sbos, Jack Rae, and Tim Lillicrap -- demonstrating a new method for using a neural network's weights for fast & compressive memory storage.
1 replies, 23 likes


Oct 22 2019 Thomas Miconi

This is so ridiculously clever!
1 replies, 22 likes


Oct 29 2019 Tiago Ramalho

Just read this paper, good work @sbos! We haven't been paying enough attention to quick associative learning. This is an interesting first step.
0 replies, 2 likes


Oct 09 2019 4InData

#4InData #MetaLearning #DeepEnergy #MachineLearning #NeuralNetworks Meta-Learning Deep Energy-Based Memory Models https://arxiv.org/abs/1910.02720?fbclid=IwAR24PgrEpgASjEQVI2dCADU8srhDIICHgXxt_feugYxBP551KaGNgOCw82o https://t.co/ityIWcHD6D
0 replies, 1 likes


Oct 21 2019 Nick Fisher

Intriguing work from DeepMind on meta-learning with energy-based memory models https://arxiv.org/pdf/1910.02720.pdf
0 replies, 1 likes

