


DeepMind: Our new work on memory uses a neural network's weights as fast and compressive associative storage. Reading from the memory is performed by approximate minimisation of the energy modelled by the network.

6 replies, 984 likes

Sergey Bartunov: Excited to share some recent work! If you are bored of stacked vectors as a memory model, look here

0 replies, 94 likes

Simon Osindero: Excited to share some new work in collaboration with @sbos, Jack Rae, and Tim Lillicrap -- demonstrating a new method for using a neural network's weights for fast & compressive memory storage.

1 reply, 23 likes

Thomas Miconi: This is so ridiculously clever!

1 reply, 22 likes

Simon Osindero: Also happy to share that another #ICLR2020 submission was accepted -- our work on "Meta-Learning Deep Energy-Based Memory Models" with @sbos, Jack Rae, and Tim Lillicrap.

0 replies, 9 likes

Tiago Ramalho: Just read this paper, good work @sbos! We haven't been paying enough attention to quick associative learning. This is an interesting first step.

0 replies, 2 likes

4InData: #4InData #MetaLearning #DeepEnergy #MachineLearning #NeuralNetworks Meta-Learning Deep Energy-Based Memory Models

0 replies, 1 likes

Nick Fisher: Intriguing work from DeepMind on meta-learning with energy-based memory models

0 replies, 1 likes


Found on Oct 21 2019

PDF content of a computer science paper: META-LEARNING DEEP ENERGY-BASED MEMORY MODELS