
Hopfield Networks is All You Need

Comments

hardmaru: The self-attention mechanism can be viewed as the update rule of a Hopfield network with continuous states. Deep learning models can take advantage of Hopfield networks as a powerful concept comprising pooling, memory, and attention. https://arxiv.org/abs/2008.02217 https://github.com/ml-jku/hopfield-layers https://t.co/Ld2eioVsDG

22 replies, 1568 likes
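
To make the claimed equivalence concrete: the paper's modern Hopfield update rule is ξ_new = X softmax(β Xᵀ ξ), and a single step of it has the same form as transformer attention, softmax(QKᵀ/√d_k) V, with the state vector as the query and the stored patterns serving as both keys and values. Below is a minimal NumPy sketch of pattern retrieval under this rule; the variable names are illustrative, not taken from the paper's hopfield-layers code.

    import numpy as np

    def softmax(z):
        z = z - z.max()                       # for numerical stability
        e = np.exp(z)
        return e / e.sum()

    rng = np.random.default_rng(0)
    N, d = 8, 16                              # 8 stored patterns of dimension 16
    X = rng.normal(size=(N, d))               # rows of X are the stored patterns
    xi = X[3] + 0.1 * rng.normal(size=d)      # noisy cue for pattern 3

    beta = 8.0                                # inverse temperature (attention uses 1/sqrt(d_k))
    for _ in range(3):
        # Modern Hopfield update: the new state is a softmax-weighted
        # average of the stored patterns. With xi as the query and X as
        # both keys and values, one step has exactly the attention form.
        xi = softmax(beta * (X @ xi)) @ X

    print("retrieved pattern index:", int(np.argmax(X @ xi)))  # expect 3

With well-separated patterns the softmax saturates, so the state lands essentially on the stored pattern after a single step, which is the one-update convergence discussed in the paper.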


Aran Komatsuzaki: Hopfield Networks is All You Need Shows that the attention mechanism of transformers is equivalent to the update rule of a modern Hopfield network with continuous states. https://arxiv.org/abs/2008.02217

10 replies, 613 likes


hardmaru: Transformer’s attention mechanism can be linked to other cool ideas in AI
- Indirect Encoding in Neuroevolution https://attentionagent.github.io
- Hopfield Networks with continuous states https://arxiv.org/abs/2008.02217
- Graph Neural Networks with multi-head attention https://thegradient.pub/transformers-are-graph-neural-networks/

5 replies, 440 likes


mat kelcey: hopfield nets!!!?! elvis playing tonight in vegas wouldn't be considered as big a comeback as this one!!

2 replies, 211 likes


IARAI: The attention mechanism of transformers is equivalent to the update rule of a modern Hopfield network with continuous states! Proud to announce the latest groundbreaking paper by Sepp Hochreiter's team and our #IARAI colleagues! 👉https://arxiv.org/abs/2008.02217 #deeplearning #ai @LITAILab https://t.co/PwqZnBxv4E

2 replies, 111 likes


AK: Hopfield Networks is All You Need pdf: https://arxiv.org/pdf/2008.02217.pdf abs: https://arxiv.org/abs/2008.02217 github: https://github.com/ml-jku/hopfield-layers https://t.co/0VmtHZK9QX

0 replies, 104 likes


Christian Szegedy: I used to think I understood transformers... https://t.co/vlxdviHjdC

2 replies, 80 likes


Tiago Ramalho: Great paper but the elephant in the room is... Shouldn't it be "Hopfield Networks *are* All You Need"? https://arxiv.org/abs/2008.02217

3 replies, 52 likes


LIT AI Lab & ELLIS Unit Linz: Transformer attention is the update rule of a modern Hopfield net with continuous states. My colleagues are very proud to have discovered a new energy function and an update rule (actually just the softmax), which converges after one update. @jbrandi6 @MichaelWidrich @HRamses2

1 replies, 30 likes
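
For reference, the energy function and update rule being described, as given in the paper (up to notational details), for a continuous state ξ, stored-pattern matrix X = (x_1, ..., x_N), and inverse temperature β:

    \mathrm{lse}(\beta, z) = \beta^{-1} \log \sum_{i=1}^{N} \exp(\beta z_i)

    E(\xi) = -\,\mathrm{lse}\big(\beta, X^{\top}\xi\big) + \tfrac{1}{2}\,\xi^{\top}\xi + \beta^{-1}\log N + \tfrac{1}{2} M^{2}, \qquad M = \max_i \lVert x_i \rVert

    \xi^{\mathrm{new}} = X\,\mathrm{softmax}\big(\beta X^{\top}\xi\big)

The new state is a softmax-weighted average of the stored patterns (hence "actually just the softmax"), and with learned query/key/value projections in place of ξ and X, this update is exactly transformer attention.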


e-Katerina Vylomova: Quiz: When Geoffrey Hinton described Hopfield networks (in one well-known online course), he mentioned their ability to make correct predictions even when a significant part of the input signal is lost, and compared them to a certain well-known Frenchman. Who was that Frenchman? (no Googling)

3 replies, 24 likes


Frederik Kratzert: Exciting research by some of my colleagues at @LITAILab on how Transformers (like GPT-3) are just a special case of modern Hopfield networks. This adds a huge body of theoretical background to Transformers, which will hopefully help to better understand why they are so successful.

2 replies, 20 likes


Mario Figueiredo: Hopfield networks strike again.

0 replies, 19 likes


Marius-Constantin Dinu: Hopfield Networks is All You Need https://arxiv.org/abs/2008.02217 https://github.com/ml-jku/hopfield-layers #MachineLearning #NeuralNetworks https://t.co/l780xMSaTS

1 replies, 17 likes


Jane Wang: 3. Hopfield Networks is All You Need https://arxiv.org/pdf/2008.02217.pdf

1 replies, 14 likes


Charles 🎉 Frye: Interesting result, but missed opportunity to call it "Transformers: Hopfields in Disguise" 🤖

0 replies, 10 likes


Vihang Patil: Super cool work by my colleagues (@HRamses2, @MichaelWidrich, @jbrandi6, @negentrop92 & others)! Hopfield Networks is All You Need introduces a new energy function with a new update rule, which is the attention mechanism!

0 replies, 9 likes


kaalam.ai: Great paper with code!!

0 replies, 7 likes


Andrew Gambardella@Machine Learner: Hopfield Networks is All You Need. This paper seems to give an interesting strategy for improving Transformers, and people on Reddit are saying it works well out of the box. As in most Sepp Hochreiter papers, the appendix is 70 pages of Greek letters https://arxiv.org/abs/2008.02217

0 replies, 6 likes


Sai Krishna G.V.: "Hopfield Networks is All You Need" https://arxiv.org/abs/2008.02217 cool stuff!

0 replies, 6 likes


Eric Leonardis: Hopfield Networks is All You Need (Ramsauer et al., 2020) Transformer and BERT models pushed performance to new levels via an attention mechanism. "We show that this attention mechanism is the update rule of a Hopfield network with continuous states." https://arxiv.org/pdf/2008.02217.pdf

1 replies, 6 likes


Kaj Sotala: https://arxiv.org/abs/2008.02217 feels interesting based on the abstract. I haven't yet dug into how transformers (such as GPT) work, but I've studied Hopfield nets a bit - they're pretty simple to understand. If this is correct, it gives me some intuition of what's going on with transformers. https://t.co/k8dWACGmA0

1 replies, 5 likes


Timothy O'Hear: This seems like it might be very significant, but I'm not finding it easy to wrap my head around. If only there was somebody out there that could explain it in a clear and entertaining way ... @ykilcher

2 replies, 5 likes


Carlos E. Perez: More reading today! Hopfield Networks is All You Need! https://arxiv.org/abs/2008.02217

1 replies, 5 likes


Christian Wolf: Transformers are Hopfield networks with continuous states. A 10-page paper with a 76-page appendix. https://arxiv.org/abs/2008.02217 https://t.co/EhZYIS5qEc

0 replies, 5 likes


Nicholas Vadivelu: Great video explaining the Hopfield Networks is All You Need paper (https://arxiv.org/abs/2008.02217)! Still wish the paper was called Hopfield Networks: Transformers in Disguise 😅

1 replies, 4 likes


LIT AI Lab & ELLIS Unit Linz: Glad to announce that our companion papers "Hopfield Networks is All You Need" and "Modern Hopfield Networks and Attention for Immune Repertoire Classification" made it to arXiv! https://arxiv.org/abs/2008.02217 https://arxiv.org/abs/2007.13505

0 replies, 4 likes


Marek Bardoński: If you have trouble understanding why Hopfield networks are all you need, make sure to watch this thorough video about them. Video: https://www.youtube.com/watch?v=nv6oFDp6rNQ&feature=youtu.be arXiv paper: https://arxiv.org/abs/2008.02217

0 replies, 4 likes


Biswajit Paul: “Binary Hopfield networks seem to be an ancient technique, however, new energy functions improved the properties of Hopfield networks.” Whoa, talk about resurrection! 🔥

1 replies, 3 likes


Alison B. Lowndes ✿ #BlackisKing: Profound work by Sepp Hochreiter (Herr LSTM) & the @LITAILab @IARAInews @unioslo_bioinfo et al + a typically extensive Appendix!

0 replies, 3 likes


HotComputerScience: Most popular computer science paper of the day: "Hopfield Networks is All You Need" https://hotcomputerscience.com/paper/hopfield-networks-is-all-you-need https://twitter.com/hardmaru/status/1291250453263441922

0 replies, 3 likes


Peter Rupprecht: Using Hopfield networks to reframe the transformer/attention mechanisms used for language models. Interesting for anybody familiar with Hopfield networks and interested in modern machine learning, in particular Figure 2 ... https://arxiv.org/abs/2008.02217 Nicely discussed by @ykilcher.

1 replies, 2 likes


Tarun: Gotta read this. Studied Hopfield Networks in an undergrad NN class during the "CNNs are all the rage" era and I was like, does this stuff matter anymore? Apparently it does 😆

0 replies, 2 likes


mikiobraun: Interesting!

0 replies, 2 likes


The Old Man In The Cave (speaking for myself only): @IntuitMachine I just saw it as a glorified Hopfield memory where n parameters store sqrt(n) concepts. Sepp Hochreiter's group did not disappoint here. But I guess #GPT3 is the closest we get to building #DeepThought before the #ELE resolves the #FermiParadox for us. https://arxiv.org/abs/2008.02217

0 replies, 2 likes
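
For contrast with the modern continuous version discussed above, here is a minimal sketch of the classical binary Hopfield memory the tweet alludes to: Hebbian outer-product weights and sign updates, whose capacity is roughly 0.14·n patterns for n neurons, i.e. on the order of the square root of the n² weight parameters. All names below are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 100, 5                               # 100 binary neurons, 5 stored patterns
    patterns = rng.choice([-1, 1], size=(p, n)).astype(float)

    # Hebbian outer-product learning rule; no self-connections
    W = (patterns.T @ patterns) / n
    np.fill_diagonal(W, 0.0)

    # Corrupt a stored pattern by flipping 20% of its bits
    probe = patterns[0].copy()
    flip = rng.choice(n, size=20, replace=False)
    probe[flip] *= -1

    # Synchronous sign updates until a fixed point is reached
    for _ in range(10):
        nxt = np.sign(W @ probe)
        nxt[nxt == 0] = 1.0                     # break ties toward +1
        if np.array_equal(nxt, probe):
            break
        probe = nxt

    print("pattern recovered:", np.array_equal(probe, patterns[0]))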


Rish 🤖: so im hearing hopfield networks are trendy again check it out: https://arxiv.org/pdf/2008.02217.pdf

1 replies, 2 likes


smdrnks: Intriguing connection: the attention mechanism of transformers is equivalent to the update rule of a modern Hopfield network (with continuous states). https://arxiv.org/abs/2008.02217

1 replies, 1 likes


PsychoSwimDad: Wow... I used a Hopfield Network to optimize plant layout 30 years ago in my thesis. Glad to see it is used nowadays too.

0 replies, 1 likes


Andre: Hopfield Networks is All You Need https://arxiv.org/abs/2008.02217

0 replies, 1 likes


arXiv CS-CL: Hopfield Networks is All You Need http://arxiv.org/abs/2008.02217

0 replies, 1 likes


Ste𝔣an 🖥️🎧⚡: Hopfield Networks is All You Need: "We show that the transformer attention mechanism is the update rule of a modern Hopfield network with continuous states" 🤯 -> https://arxiv.org/abs/2008.02217

0 replies, 1 likes


LIT AI Lab & ELLIS Unit Linz: Our latest research paper @JKU goes viral! "Super interesting discussions there," says Yannic Kilcher, ETH Zürich. #ai #transformer #attention Hopfield Networks is All You Need (Paper Explained) https://www.youtube.com/watch?v=nv6oFDp6rNQ&feature=youtu.be , Paper: https://arxiv.org/abs/2008.02217

0 replies, 1 likes


andrea panizza: Paper from 3 teams (incl. Sepp Hochreiter's) doing the rounds! They interpret Transformers as a continuous-state extension of Hopfield networks. Lots of interesting insights about the learning process. It's an interesting step towards a theory of Transformers! https://arxiv.org/abs/2008.02217

0 replies, 1 likes

