
Weight Agnostic Neural Networks

Comments

Jun 12 2019 hardmaru

Weight Agnostic Neural Networks 🦎 Inspired by precocial species in biology, we set out to search for neural net architectures that can already (sort of) perform various tasks even when they use random weight values. Article: https://weightagnostic.github.io PDF: https://arxiv.org/abs/1906.04358 https://t.co/El2uzgxS5I
61 replies, 2351 likes
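
The core trick described in the tweet above is easy to state in code. Below is a minimal sketch of how one might score a single fixed topology the weight-agnostic way: tie every connection to one shared weight and average performance over a handful of sampled values. The toy DAG encoding, the tanh activation, and the `task` callable are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of weight-agnostic evaluation: one shared weight for every
# connection, performance averaged over several sampled values.
import numpy as np

def forward(x, adjacency, shared_w):
    """Propagate through a feed-forward DAG whose edges all share one weight.

    `adjacency` is an upper-triangular 0/1 matrix; the first len(x) nodes
    are inputs and the last node is the output (a toy encoding).
    """
    acts = np.zeros(adjacency.shape[0])
    acts[:len(x)] = x
    for node in range(len(x), adjacency.shape[0]):
        acts[node] = np.tanh((shared_w * adjacency[:, node]) @ acts)
    return acts[-1]

def fitness(adjacency, task, weights=(-2.0, -1.0, -0.5, 0.5, 1.0, 2.0)):
    """Score an architecture by its *average* reward across shared weights.

    `task` is any callable that runs a policy function and returns a reward.
    """
    return float(np.mean([task(lambda x: forward(x, adjacency, w))
                          for w in weights]))
```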


Sep 06 2019 hardmaru

“Weight Agnostic Neural Networks” has been accepted as a spotlight presentation at #NeurIPS2019! Our proposed method achieved an eye-popping accuracy of 94% on MNIST, significantly underperforming the state-of-the-art 🔥🔥🔥 Updated paper → https://arxiv.org/abs/1906.04358 https://t.co/67butfWh7h
19 replies, 340 likes


Aug 06 2019 Daniel Roy

I really really like this work. It poses so many interesting theoretical questions.
4 replies, 249 likes


Jun 19 2019 Janislav Jankov

My notes on "Weight Agnostic Neural Networks" by Adam Gaier and David Ha https://arxiv.org/abs/1906.04358 This paper was such a breeze to read! As expected from @hardmaru. We know the network’s architecture plays a significant role in its ability to solve a problem. But how much? 1/9
2 replies, 225 likes


Jun 12 2019 Kyle McDonald

i love david's work because he doesn't think one step ahead, he thinks one step to the side. other researchers spending their precious time on backprop? just pick random weights and see what's possible!
1 replies, 216 likes


Aug 28 2019 Vivek Das

This is becoming sorcery. What did I just read & see? I still need to wrap my head around it in depth. Going weight agnostic but still ranking & learning based on complexity. 😳😱 https://arxiv.org/abs/1906.04358 Google AI Blog: Exploring Weight Agnostic Neural Networks http://ai.googleblog.com/2019/08/exploring-weight-agnostic-neural.html
8 replies, 209 likes


Jun 12 2019 Emtiyaz Khan

An out-of-the-box idea by @hardmaru and team! For a Bayesian, this is like choosing an architecture such that the posterior distribution is uniform (contains no information at all)!!
3 replies, 122 likes
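
To unpack that reading (my paraphrase of the tweet, not a claim from the paper): if an architecture A* performs well for almost any weight setting, the likelihood is nearly flat in the weights, so observing the data leaves the posterior over weights as uninformative as the prior:

```latex
p(w \mid \mathcal{D}, A^\star)
  \;\propto\; p(\mathcal{D} \mid w, A^\star)\, p(w)
  \;\approx\; p(w),
\qquad \text{since } p(\mathcal{D} \mid w, A^\star) \text{ is roughly constant in } w.
```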


Jun 12 2019 Brandon Rohrer

The original promise of neural networks was that a single architecture could learn anything by varying its weights. This work shows that NNs can learn nearly as well by keeping constant weights and varying the architecture. Structure matters more than originally believed.
2 replies, 100 likes


Jun 12 2019 Julian Togelius

Very cool work!
1 replies, 38 likes


Aug 09 2019 George A Constantinides

This is very nice work, and what a wonderfully interactive way to write it up!
1 replies, 35 likes


Jun 12 2019 Marco Salvi

Impressive and fascinating work!
1 replies, 34 likes


Jun 12 2019 Suzana Ilić

Wow. 🤯
1 replies, 32 likes


Jun 15 2019 Albert Cardona

When neural circuit architecture matters more than synaptic weights: “Weight agnostic neural networks”, Gaier & Ha, 2019 https://arxiv.org/abs/1906.04358
2 replies, 31 likes


Jun 12 2019 Pablo Samuel Castro

very cool and neat idea, and the presentation of the article on the github site is just fantastic. @hardmaru i wonder how this type of idea would work on value-based methods, where your network is encoding expected value (or distribution over values) for states instead of policy?
1 replies, 26 likes


Jun 20 2019 Heiga Zen (全 炳河)

When I first heard the idea from David, I was really impressed and keen to know how it would perform. I am lucky that his desk at Google is next to mine :-)
0 replies, 19 likes


Jun 16 2019 Arunava

.@hardmaru recently published Weight Agnostic Neural Networks. The paper shows the importance of the architecture itself and gets more than 90% accuracy on the MNIST test set with random weights :) Thanks David :) Paper: https://arxiv.org/abs/1906.04358 #MachineLearning #DeepLearning #AI https://t.co/CCfJZyzk7N
0 replies, 17 likes


Sep 06 2019 Erwin Coumans

Congrats Adam Gaier and David @hardmaru Ha! Exciting results and a new experimentation framework (also for learning 3D locomotion) by Brain Tokyo! This is not a 'paper weight'.
0 replies, 16 likes


Jun 17 2019 Kevin Mitchell

@albertcardona So interesting! Instincts and abilities can get wired into the architecture of neural circuits so as to function w/o learning by the individual. (In animals, this is because evolution has done the deepest of deep learning already...) cc @hardmaru @GaryMarcus
2 replies, 14 likes


Aug 22 2019 Andres Torrubia

@TonyZador @NPCollapse I loved the paper. You may already know about weight-agnostic ANNs, which may be "encoded" in the genetic bottleneck by the equivalent of a random seed (?) https://arxiv.org/abs/1906.04358.
1 replies, 14 likes


Jun 12 2019 samim

Fascinating work!
1 replies, 14 likes


Jul 17 2019 Alexandros Goulas

Artificial neuronal networks showcasing the importance of network topology | optimizing topology only - not weights - is sufficient to generate meaningful behavior | https://arxiv.org/abs/1906.04358 https://t.co/KNhLGBdzMs
0 replies, 13 likes


Jun 12 2019 Joshua Achiam

Amazing work, really lovely concept.
1 replies, 11 likes


Aug 22 2019 Kevin Mitchell

@PabloRedux @MelMitchell1 @svalver @sd_marlow @GaryMarcus @TonyZador Very relevant paper (on idea of selecting neural architectures for types of learning - i.e., meta-learning through evolution): Weight Agnostic Neural Networks https://arxiv.org/abs/1906.04358
1 replies, 9 likes


Sep 06 2019 Suzana Ilić

👏👏👏
0 replies, 9 likes


Aug 06 2019 Kwabena Boahen

Super intriguing use of evolutionary algorithms to search for topologies instead of stochastic gradient descent to search for weights.
0 replies, 7 likes
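
In that spirit, here is a toy outer loop that searches topologies rather than weights, reusing the `fitness` sketch from earlier in this thread. The real method builds on NEAT, with richer mutation operators and complexity-aware ranking, so treat this plain hill-climber as an illustrative stand-in:

```python
# Toy topology search: mutate architectures, score each with the shared-weight
# fitness, keep the best. A stand-in for the paper's NEAT-based search.
import random
import numpy as np

def mutate(adjacency):
    """Flip one feed-forward connection on or off (a crude stand-in for
    NEAT-style add-node / add-connection / change-activation operators)."""
    child = adjacency.copy()
    i, j = sorted(random.sample(range(child.shape[0]), 2))  # keep i < j
    child[i, j] = 1 - child[i, j]
    return child

def search(adjacency, fitness, task, generations=200, pop_size=16):
    best, best_fit = adjacency, fitness(adjacency, task)
    for _ in range(generations):
        children = [mutate(best) for _ in range(pop_size)]
        scores = [fitness(child, task) for child in children]
        if max(scores) > best_fit:
            best_fit = max(scores)
            best = children[int(np.argmax(scores))]
    return best, best_fit
```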


Jun 17 2019 Albert Cardona

One of the authors of “Weight agnostic neural networks” is on twitter: https://mobile.twitter.com/hardmaru/status/1138600152048910336
1 replies, 6 likes


Jun 12 2019 Giovanni Petrantoni

We are working on a product very much inspired by this and neuroevolution in general. MNIST was a huge challenge for me in terms of CPU optimizations. Given that you work at Google, and Google Cloud is probably cheap for you :) I wonder how many VMs you had to throw at it?
0 replies, 6 likes


Jun 14 2019 Namhoon Lee

Seemingly great work here; it would have been nice if SNIP had been mentioned in their related work as a pruning method for "untrained, randomly initialized neural networks".
1 replies, 6 likes


Aug 16 2019 Your Personal Neurocrackpot

Broke: Weighted connections & functional topologies Woke: Architectures & strong inductive biases Bespoke: Cybernetic infrastructures that attune systems to contexts and organise percolations of conjugate flows to generate "self-imploding explosions" in combinatorial search 💩
0 replies, 5 likes


Jun 13 2019 Abi Aryan

Excellent paper with a very interesting idea.
1 replies, 5 likes


Jun 12 2019 Rebel Science

Deep learning experts should not claim to be inspired by biology. Unlike DNNs, brains don't process data but changes in the environment, aka transitions. The brain is mainly a massive timing mechanism. It doesn't optimize objective functions: no gradients, backprop or labels.
0 replies, 4 likes


Jun 13 2019 Dileep George

Interesting work on discovering idiosyncratic circuits for specific tasks. As I argue in this talk https://slideslive.com/38909792/building-machines-that-work-like-the-brain, this is the opposite of building general intelligence. Current DL can be thought of as a systematic way to discover idiosyncratic circuits.
1 replies, 4 likes


Jun 18 2019 BioDecoded

Weight Agnostic Neural Networks | arXiv https://arxiv.org/abs/1906.04358 https://weightagnostic.github.io/ #DeepLearning https://t.co/7asitT8aol
0 replies, 4 likes


Jun 12 2019 Statistics Papers

Weight Agnostic Neural Networks. http://arxiv.org/abs/1906.04358
0 replies, 4 likes


Jun 12 2019 Massimo Quadrana

Fix a random shared weight and search for the best architecture... Really fascinating work!
0 replies, 4 likes


Sep 06 2019 Kevin Mitchell

@VenkRamaswamy @KordingLab See also: Weight Agnostic Neural Networks https://arxiv.org/abs/1906.04358 https://t.co/ECAxT9Rj1i
0 replies, 3 likes


Jun 12 2019 Daniel Situnayake

This is so cool! Shifting complexity from weights to architecture, so a model can (kinda) work with random weights and can be trained from that starting point. I love that it was inspired by biology:
0 replies, 3 likes


Jun 17 2019 Sandeep Kishore

This is so true especially of motor networks. Good to go, relatively speaking, from the start. Interesting thread based on "weight agnostic neural networks" from @hardmaru https://arxiv.org/abs/1906.04358
0 replies, 3 likes


Jun 12 2019 Pablo Cordero

Reminds me of liquid state machines and echo state networks, essentially finding good kernels to bootstrap from. @hardmaru's interactive article is, as usual, in a class of its own; the interactive demos are superb.
0 replies, 3 likes
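
For contrast with those reservoir methods: an echo state network keeps its recurrent weights random and fixed and trains only a linear readout, whereas a WANN keeps the weights random and searches the topology instead. A bare-bones reservoir sketch follows (sizes and scaling are illustrative; `train_inputs` / `train_targets` are placeholder arrays):

```python
# Bare-bones echo state network: random fixed reservoir, trained linear
# readout. Contrast with WANNs, which fix weights and search the topology.
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 200, 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep spectral radius < 1

def run_reservoir(inputs):
    """Collect reservoir states for an input sequence of shape [T, n_in]."""
    h, states = np.zeros(n_res), []
    for u in inputs:
        h = np.tanh(W @ h + W_in @ u)
        states.append(h.copy())
    return np.array(states)

# Fit only the readout by least squares (placeholder data assumed):
# X = run_reservoir(train_inputs)               # [T, n_res]
# W_out, *_ = np.linalg.lstsq(X, train_targets, rcond=None)
```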


Jun 13 2019 OGAWA, Tadashi

=> Weight Agnostic Neural Networks, Google, arXiv, Jun 11, 2019 https://arxiv.org/abs/1906.04358 Neural network architectures that can already perform a task without any explicit weight training, on supervised learning domains. https://weightagnostic.github.io/ Interactive Demo https://github.com/weightagnostic/weightagnostic.github.io https://t.co/finRYC4iyo
1 replies, 2 likes


Jun 12 2019 Claus Aranha

Why worry about the little weights, when we can focus on the network structure instead? Fantastic work by @hardmaru, and a very easy to read page/paper. I love the idea of ensembles of untrained weights.
0 replies, 2 likes


Aug 28 2019 Dan Buscombe

There are now two ways to use artificial neural networks to solve data-driven problems: 1. Design a model architecture and have the computer learn optimal weights (trad), or now 2. Have the computer learn how to build the architecture from the bricks you feed it.
1 replies, 2 likes


Sep 06 2019 OGAWA, Tadashi

=> "Weight Agnostic Neural Networks", Google Brain, arXiv, Sep 5, 2019 (v2) https://arxiv.org/abs/1906.04358 How important are the weight parameters of a NN? https://twitter.com/ogawa_tter/status/1139170853940150272 "has been accepted as a spotlight presentation at NeurIPS2019", Sep 6, 2019 https://twitter.com/hardmaru/status/1169788797262782464
1 replies, 1 likes


Jun 12 2019 Selim

🤔
1 replies, 1 likes


Jun 12 2019 Lana Sinapayen

Great idea here: focusing on architecture rather than weights. A bit like the innate/learned spectrum
0 replies, 1 likes


Jul 16 2019 bitcraft lab

Interesting article on Weight Agnostic Neural Networks by @hardmaru >>> https://twitter.com/hardmaru/status/1138600152048910336 <<< makes me want to revisit Stuart Kauffman's Random Boolean Networks and look into Boolean Neural Networks :) #selforganisation #neuralnet #RBN #BNN https://t.co/WW2TtucW0l
0 replies, 1 likes


Aug 28 2019 Susheel Busi

@claczny
0 replies, 1 likes


Aug 07 2019 Bhav Ashok

Architecture is to the brain as weights are to learned knowledge; the structure compensates for what the environment/supervision/optimization lacks.
0 replies, 1 likes


Jun 13 2019 Hamlet 🇩🇴

"This idea came out after a few drinks in Roppongi." The Unreasonable Effectiveness of Alcohol 🥃 👇👇👇🤯
0 replies, 1 likes


Jun 12 2019 Pierre Richemond

Some try to see the future, others embody it :) Congrats Adam and David!
1 replies, 0 likes


Jun 12 2019 Brundage Bot

Weight Agnostic Neural Networks. Adam Gaier and David Ha http://arxiv.org/abs/1906.04358
1 replies, 0 likes

