Papers of the day

Weight Agnostic Neural Networks

Comments

hardmaru: Weight Agnostic Neural Networks 🦎 Inspired by precocial species in biology, we set out to search for neural net architectures that can already (sort of) perform various tasks even when they use random weight values. Article: https://weightagnostic.github.io PDF: https://arxiv.org/abs/1906.04358 https://t.co/El2uzgxS5I

62 replies, 2345 likes


hardmaru: "Weight Agnostic Neural Networks" has been accepted as a spotlight presentation at #NeurIPS2019! Our proposed method achieved an eye-popping accuracy of 94% on MNIST, significantly underperforming the state-of-the-art 🔥🔥🔥 Updated paper → https://arxiv.org/abs/1906.04358 https://t.co/67butfWh7h

20 replies, 354 likes


Daniel Roy: I really really like this work. It poses so many interesting theoretical questions.

4 replies, 249 likes


Janislav Jankov: My notes on "Weight Agnostic Neural Networks" by Adam Gaier and David Ha https://arxiv.org/abs/1906.04358 This paper was such a breeze to read! As expected from @hardmaru. We know the network's architecture plays a significant role in its ability to solve a problem. But how much? 1/9

2 replies, 225 likes


Kyle McDonald: i love david's work because he doesn't think one step ahead, he thinks one step to the side. other researchers spending their precious time on backprop? just pick random weights and see what's possible!

1 replies, 216 likes


Vivek Das: This is becoming sorcery. What did I just read & see? I still need to wrap my head around it in depth. Going weight agnostic but still ranking & learning based on complexity. 😳😱 https://arxiv.org/abs/1906.04358 Google AI Blog: Exploring Weight Agnostic Neural Networks http://ai.googleblog.com/2019/08/exploring-weight-agnostic-neural.html

8 replies, 209 likes


hardmaru: If you are at #NeurIPS2019, pls swing by to chat about weight agnostic neural networks this morning! 🧠 10:45 AM–12:45 PM @ East Exhibition Hall B + C #149 http://weightagnostic.github.io https://t.co/FsopsrfXxk

3 replies, 196 likes


Emtiyaz Khan: An out-of-the-box idea by @hardmaru and team! For a Bayesian, this is like choosing an architecture such that the posterior distribution is uniform (contains no information at all)!!

3 replies, 122 likes
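
One hedged way to write that Bayesian reading down, in this note's own notation (the symbol R and the uniform range are illustrative assumptions, not the paper's exact formulation): the search scores an architecture A not at a single tuned weight but by its expected performance when every connection shares one weight value w, so the best A is the one whose behavior depends on w as little as possible:

    A^{\star} \;=\; \arg\max_{A} \; \mathbb{E}_{w \sim \mathcal{U}[-2,\,2]} \big[\, R(A, w) \,\big]

In practice the paper approximates this expectation by averaging R over a small fixed series of shared-weight values rather than integrating over the whole range.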


Brandon Rohrer: The original promise of neural networks was that a single architecture could learn anything by varying its weights. This work shows that NNs can learn nearly as well by keeping constant weights and varying the architecture. Structure matters more than originally believed.

2 replies, 100 likes
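
A minimal sketch of that claim, assuming nothing from the authors' codebase: a hand-wired two-hidden-unit network in which every connection carries the same single weight w solves a toy sign-agreement task for every nonzero w, so its score is flat across a series of shared-weight values like the one the paper sweeps.

    import numpy as np

    # Hand-wired topology: inputs 0,1 -> hidden 2,3 -> output 4. Every
    # connection carries the SAME shared weight w (the weight agnostic trick).
    CONNECTIONS = [(0, 2), (1, 2), (0, 3), (1, 3), (2, 4), (3, 4)]

    def forward(x, w):
        vals = {0: x[0], 1: x[1]}
        for node in (2, 3, 4):  # process nodes in topological order
            total = sum(w * vals[s] for s, d in CONNECTIONS if d == node)
            vals[node] = np.tanh(total)  # tanh is one of the paper's activations
        return vals[4]

    def fitness(w):
        # Toy task: the output's sign should match the sign of x0 + x1.
        xs = [(-1.0, -0.5), (-0.2, 0.4), (0.3, 0.3), (1.0, 0.8)]
        return np.mean([np.sign(forward(x, w)) == np.sign(sum(x)) for x in xs])

    # Score the fixed architecture at each shared-weight value:
    for w in (-2.0, -1.0, -0.5, 0.5, 1.0, 2.0):
        print(f"w={w:+.1f}  fitness={fitness(w):.2f}")  # prints 1.00 for every w

Because tanh is odd, the output's sign reduces to sign(w^2 * (x0 + x1)) = sign(x0 + x1) for any nonzero w: the task is solved by the wiring, not by the weight value. Averaging such per-weight scores, with a preference for fewer connections, is roughly how the search ranks candidate architectures.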


hardmaru: @dennybritz We had a paper accepted at NeurIPS2019 as a spotlight, even though the method (even after extensive hyperparameter tuning) only achieved 94% on MNIST https://twitter.com/hardmaru/status/1169788797262782464

1 replies, 87 likes


Julian Togelius: Very cool work!

1 replies, 38 likes


George A Constantinides: This is very nice work, and what a wonderfully interactive way to write it up!

1 replies, 35 likes


Marco Salvi: Impressive and fascinating work!

1 replies, 34 likes


Suzana Ilić: Wow. 🤯

1 replies, 32 likes


Albert Cardona: When neural circuit architecture matters more than synaptic weights: "Weight agnostic neural networks", Gaier & Ha, 2019 https://arxiv.org/abs/1906.04358

2 replies, 31 likes


Pablo Samuel Castro: very cool and neat idea, and the presentation of the article on the github site is just fantastic. @hardmaru i wonder how this type of idea would work on value-based methods, where your network is encoding expected value (or distribution over values) for states instead of policy?

1 replies, 26 likes


Heiga Zen (全 炳河): When I first heard the idea from David, I was really impressed and keen to know how it would perform. I am lucky that his desk at Google is next to mine :-)

0 replies, 19 likes


Arunava: .@hardmaru recently published Weight Agnostic Neural Networks. The paper shows the importance of the architecture itself and gets more than 90% accuracy on the MNIST test set with random weights :) Thanks David :) Paper: https://arxiv.org/abs/1906.04358 #MachineLearning #DeepLearning #AI https://t.co/CCfJZyzk7N

0 replies, 17 likes


Erwin Coumans: Congrats Adam Gaier and David @hardmaru Ha! Exciting results and a new experimentation framework (also for learning 3D locomotion) by Brain Tokyo! This is not a 'paper weight'.

0 replies, 16 likes


Kevin Mitchell: @albertcardona So interesting! Instincts and abilities can get wired into the architecture of neural circuits so as to function w/o learning by the individual. (In animals, this is because evolution has done the deepest of deep learning already...) cc @hardmaru @GaryMarcus

2 replies, 14 likes


Andres Torrubia: @TonyZador @NPCollapse I loved the paper. You may already know about weight-agnostic ANNs, which may be "encoded" in the genetic bottleneck by the equivalent of a random seed (?) https://arxiv.org/abs/1906.04358.

1 replies, 14 likes


samim: Fascinating work!

1 replies, 14 likes


Alexandros Goulas: Artificial neuronal networks showcasing the importance of network topology | optimizing topology only - not weights - is sufficient to generate meaningful behavior | https://arxiv.org/abs/1906.04358 https://t.co/KNhLGBdzMs

0 replies, 13 likes


Joshua Achiam: Amazing work, really lovely concept.

1 replies, 11 likes


Kevin Mitchell: @PabloRedux @MelMitchell1 @svalver @sd_marlow @GaryMarcus @TonyZador Very relevant paper (on idea of selecting neural architectures for types of learning - i.e., meta-learning through evolution): Weight Agnostic Neural Networks https://arxiv.org/abs/1906.04358

1 replies, 9 likes


Suzana Iliฤ‡: ๐Ÿ‘๐Ÿ‘๐Ÿ‘

0 replies, 9 likes


Kwabena Boahen: Super intriguing use of evolutionary algorithms to search for topologies instead of stochastic gradient descent to search for weights.

0 replies, 7 likes
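
In that spirit, a toy sketch of the outer search loop (the genome encoding, the single mutation operator, and the stand-in fitness below are illustrative assumptions; the paper's NEAT-inspired operators also insert nodes and change activation functions):

    import random

    N_NODES = 6                                    # nodes 0,1: inputs; node 5: output
    SHARED_WEIGHTS = (-2.0, -1.0, -0.5, 0.5, 1.0, 2.0)

    def mutate(genome, rng):
        """Return a copy with one extra feedforward connection."""
        g = set(genome)
        src = rng.randrange(N_NODES - 1)
        dst = rng.randrange(max(src + 1, 2), N_NODES)  # never feed an input node
        g.add((src, dst))                              # lower -> higher id keeps it acyclic
        return frozenset(g)

    def mean_fitness(genome, fitness_fn):
        # Average over shared weights, so no single tuned value can rescue
        # an architecture that only works at one weight setting.
        return sum(fitness_fn(genome, w) for w in SHARED_WEIGHTS) / len(SHARED_WEIGHTS)

    def evolve(fitness_fn, generations=50, pop_size=16, seed=0):
        rng = random.Random(seed)
        population = [frozenset({(0, 5), (1, 5)})] * pop_size
        for _ in range(generations):
            ranked = sorted(population,
                            key=lambda g: (mean_fitness(g, fitness_fn), -len(g)),
                            reverse=True)              # best fitness first; ties -> fewer edges
            parents = ranked[:pop_size // 2]
            children = [mutate(rng.choice(parents), rng)
                        for _ in range(pop_size - len(parents))]
            population = parents + children
        return max(population, key=lambda g: mean_fitness(g, fitness_fn))

    # Stand-in fitness so the sketch runs end to end; a real one would roll the
    # network out on a task with all connections set to the shared weight w.
    best = evolve(lambda g, w: float(len(g)))
    print(len(best), "connections in the best genome")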


Namhoon Lee: Seemingly great work here; it would have been nice if SNIP were mentioned in their related work as a pruning method for "untrained, randomly initialized neural networks".

1 replies, 6 likes


Albert Cardona: One of the authors of โ€œWeight agnostic neural networksโ€ is on twitter: https://mobile.twitter.com/hardmaru/status/1138600152048910336

1 replies, 6 likes


Giovanni Petrantoni: We are working on a product very much inspired by this and neuroevolution in general. MNIST was a huge challenge for me in terms of CPU optimizations. Given that you work at Google, and Google Cloud is probably cheap for you :) I wonder how many VMs you had to throw at it?

0 replies, 6 likes


Your Personal Neurocrackpot: Broke: Weighted connections & functional topologies Woke: Architectures & strong inductive biases Bespoke: Cybernetic infrastructures that attune systems to contexts and organise percolations of conjugate flows to generate "self-imploding explosions" in combinatorial search 💩

0 replies, 5 likes


Abi Aryan: Excellent paper with a very interesting idea.

1 replies, 5 likes


Statistics Papers: Weight Agnostic Neural Networks. http://arxiv.org/abs/1906.04358

0 replies, 4 likes


Rebel Science: Deep learning experts should not claim to be inspired by biology. Unlike DNNs, brains don't process data but changes in the environment, aka transitions. The brain is mainly a massive timing mechanism. It doesn't optimize objective functions: no gradients, backprop or labels.

0 replies, 4 likes


BioDecoded: Weight Agnostic Neural Networks | arXiv https://arxiv.org/abs/1906.04358 https://weightagnostic.github.io/ #DeepLearning https://t.co/7asitT8aol

0 replies, 4 likes


Dileep George: Interesting work on discovering idiosyncratic circuits for specific tasks. As I argue in this talk https://slideslive.com/38909792/building-machines-that-work-like-the-brain, this is the opposite of building general intelligence. Current DL can be thought of as a systematic way to discover idiosyncratic circuits.

1 replies, 4 likes


Massimo Quadrana: Fix a random shared weight and search for the best architecture... Really fascinating work!

0 replies, 4 likes


Pablo Cordero: Reminds me of liquid state machines and echo state networks: essentially finding good kernels to bootstrap from. @hardmaru's interactive article is, as usual, in a class of its own; the interactive demos are superb.

0 replies, 3 likes


Daniel Situnayake: This is so cool! Shifting complexity from weights to architecture, so a model can (kinda) work with random weights and can be trained from that starting point. I love that it was inspired by biology:

0 replies, 3 likes


Kevin Mitchell: @VenkRamaswamy @KordingLab See also: Weight Agnostic Neural Networks https://arxiv.org/abs/1906.04358 https://t.co/ECAxT9Rj1i

0 replies, 3 likes


hardmaru: Adam Gaier will be presenting our paper on "Weight Agnostic Neural Networks" spotlight presentation (Dec 10th 5:25pm) → https://bit.ly/2OQIEpV poster session (Dec 11th) → https://bit.ly/2LqntsF paper → https://bit.ly/38ckkXs tweetstorm ↓ https://twitter.com/hardmaru/status/1138600152048910336

0 replies, 3 likes


Sandeep Kishore: This is so true especially of motor networks. Good to go, relatively speaking, from the start. Interesting thread based on "weight agnostic neural networks" from @hardmaru https://arxiv.org/abs/1906.04358

0 replies, 3 likes


Claus Aranha: Why worry about the little weights, when we can focus on the network structure instead? Fantastic work by @hardmaru, and a very easy to read page/paper. I love the idea of ensembles of untrained weights.

0 replies, 2 likes


OGAWA, Tadashi: => Weight Agnostic Neural Networks, Google, arXiv, Jun 11, 2019 https://arxiv.org/abs/1906.04358 Neural network architectures that can already perform a task without any explicit weight training, on the supervised learning domain. https://weightagnostic.github.io/ Interactive Demo https://github.com/weightagnostic/weightagnostic.github.io https://t.co/finRYC4iyo

1 replies, 2 likes


ๅฐ็Œซ้Šใ‚Šใ‚‡ใ†๏ผˆใŸใ‹ใซใ‚ƒใ—ใƒปใ‚Šใ‚‡ใ†๏ผ‰: Weight Agnostic Neural Networks https://arxiv.org/abs/1906.04358

1 replies, 2 likes


Dan Buscombe: There are now two ways to use artificial neural networks to solve data-driven problems: 1. Design a model architecture and have the computer learn optimal weights (trad), or now 2. Have the computer learn how to build the architecture from the bricks you feed it.

1 replies, 2 likes


OGAWA, Tadashi: => "Weight Agnostic Neural Networks", Google Brain, arXiv, Sep 5, 2019 (v2) https://arxiv.org/abs/1906.04358 How important are the weight parameters of a NN? https://twitter.com/ogawa_tter/status/1139170853940150272 "has been accepted as a spotlight presentation at NeurIPS2019", Sep 6, 2019 https://twitter.com/hardmaru/status/1169788797262782464

1 replies, 1 likes


Hamlet 🇩🇴: "This idea came out after a few drinks in Roppongi." The Unreasonable Effectiveness of Alcohol 🥃 👇👇👇🤯

0 replies, 1 likes


Susheel Busi: @claczny

0 replies, 1 likes


bitcraft lab: Interesting article on Weight Agnostic Neural Networks by @hardmaru >>> https://twitter.com/hardmaru/status/1138600152048910336 … <<< makes me want to revisit Stuart Kauffman's Random Boolean Networks and look into Boolean Neural Networks :) #selforganisation #neuralnet #RBN #BNN https://t.co/WW2TtucW0l

0 replies, 1 likes


Lana Sinapayen: Great idea here: focusing on architecture rather than weights. A bit like the innate/learned spectrum

0 replies, 1 likes


Selim: 🤔

1 replies, 1 likes


Bhav Ashok: Architecture is to the brain as weights are to learned knowledge; the structure compensates for what the environment/supervision/optimization lacks.

0 replies, 1 likes


Pierre Richemond: Some try to see the future, others embody it :) Congrats Adam and David!

1 replies, 0 likes


gonzo_ML: ๐—ช๐—ฒ๐—ถ๐—ด๐—ต๐˜ ๐—”๐—ด๐—ป๐—ผ๐˜€๐˜๐—ถ๐—ฐ ๐—ก๐—ฒ๐˜‚๐—ฟ๐—ฎ๐—น ๐—ก๐—ฒ๐˜๐˜„๐—ผ๐—ฟ๐—ธ๐˜€ Authors: Adam Gaier (@adam_gaier), David Ha (@hardmaru) Article: https://arxiv.org/abs/1906.04358 Interactive article: https://weightagnostic.github.io/ Code: https://github.com/weightagnostic/weightagnostic.github.io https://t.co/nRO4f7XwMm

1 replies, 0 likes


Brundage Bot: Weight Agnostic Neural Networks. Adam Gaier and David Ha http://arxiv.org/abs/1906.04358

1 replies, 0 likes


Content

Found on Jun 12 2019 at https://arxiv.org/pdf/1906.04358.pdf

PDF content of a computer science paper: Weight Agnostic Neural Networks