
Weight Agnostic Neural Networks


Jun 12 2019 hardmaru

Weight Agnostic Neural Networks 🦎 Inspired by precocial species in biology, we set out to search for neural net architectures that can already (sort of) perform various tasks even when they use random weight values. Article: PDF:
62 replies, 2350 likes

Sep 06 2019 hardmaru

“Weight Agnostic Neural Networks” has been accepted as a spotlight presentation at #NeurIPS2019! Our proposed method achieved an eye-popping accuracy of 94% on MNIST, significantly underperforming the state-of-the-art 🔥🔥🔥 Updated paper →
19 replies, 340 likes

Aug 06 2019 Daniel Roy

I really really like this work. It poses so many interesting theoretical questions.
4 replies, 249 likes

Jun 19 2019 Janislav Jankov

My notes on "Weight Agnostic Neural Networks" by Adam Gaier and David Ha This paper was such a breeze to read! As expected by @hardmaru. We know the network's architecture plays a significant role in its ability to solve a problem. But how much? 1/9
2 replies, 225 likes

Jun 12 2019 Kyle McDonald

i love david's work because he doesn't think one step ahead, he thinks one step to the side. other researchers spending their precious time on backprop? just pick random weights and see what's possible!
1 replies, 216 likes

Aug 28 2019 Vivek Das

This is becoming sorcery. What did I just read & see? I still need to wrap my head around it in depth. Going weight agnostic but still ranking & learning based on complexity. 😳😱 Google AI Blog: Exploring Weight Agnostic Neural Networks
8 replies, 209 likes

Dec 11 2019 hardmaru

If you are at #NeurIPS2019, pls swing by to chat about weight agnostic neural networks this morning! 🧠 10:45 AM—12:45 PM @ East Exhibition Hall B + C #149
3 replies, 196 likes

Jun 12 2019 Emtiyaz Khan

An out-of-the-box idea by @hardmaru and team! For a Bayesian, this is like choosing an architecture such that the posterior distribution is uniform (contains no information at all)!!
3 replies, 122 likes

Jun 12 2019 Brandon Rohrer

The original promise of neural networks was that a single architecture could learn anything by varying its weights. This work shows that NNs can learn nearly as well by keeping constant weights and varying the architecture. Structure matters more than originally believed.
2 replies, 100 likes

Jun 12 2019 Julian Togelius

Very cool work!
1 replies, 38 likes

Aug 09 2019 George A Constantinides

This is very nice work, and what a wonderfully interactive way to write it up!
1 replies, 35 likes

Jun 12 2019 Marco Salvi

Impressive and fascinating work!
1 replies, 34 likes

Jun 12 2019 Suzana Ilić

Wow. 🤯
1 replies, 32 likes

Jun 15 2019 Albert Cardona

When neural circuit architecture matters more than synaptic weights: “Weight agnostic neural networks”, Gaier & Ha, 2019
2 replies, 31 likes

Jun 12 2019 Pablo Samuel Castro

very cool and neat idea, and the presentation of the article on the github site is just fantastic. @hardmaru i wonder how this type of idea would work on value-based methods, where your network is encoding expected value (or distribution over values) for states instead of policy?
1 replies, 26 likes

Jun 20 2019 Heiga Zen (全 炳河)

When I first heard the idea from David, I was really impressed and keen to know how it would perform. I am lucky that his desk at Google is next to mine :-)
0 replies, 19 likes

Jun 16 2019 Arunava

.@hardmaru recently published Weight Agnostic Neural Networks. The paper shows the importance of the Architecture itself and gets more than 90% acc on MNIST test set with random weights :) Thanks David :) Paper: #MachineLearning #DeepLearning #AI
0 replies, 17 likes

Sep 06 2019 Erwin Coumans

Congrats Adam Gaier and David @hardmaru Ha! Exciting results and a new experimentation framework (also for learning 3D locomotion) by Brain Tokyo! This is not a 'paper weight'.
0 replies, 16 likes

Aug 22 2019 Andres Torrubia

@TonyZador @NPCollapse I loved the paper. You may already know about weight-agnostic ANNs, which may be "encoded" in the genetic bottleneck by the equivalent of a random seed (?)
1 replies, 14 likes

Jun 12 2019 samim

Fascinating work!
1 replies, 14 likes

Jun 17 2019 Kevin Mitchell

@albertcardona So interesting! Instincts and abilities can get wired into the architecture of neural circuits so as to function w/o learning by the individual. (In animals, this is because evolution has done the deepest of deep learning already...) cc @hardmaru @GaryMarcus
2 replies, 14 likes

Jul 17 2019 Alexandros Goulas

Artificial neuronal networks showcasing the importance of network topology | optimizing topology only - not weights - is sufficient to generate meaningful behavior |
0 replies, 13 likes

Jun 12 2019 Joshua Achiam

Amazing work, really lovely concept.
1 replies, 11 likes

Aug 22 2019 Kevin Mitchell

@PabloRedux @MelMitchell1 @svalver @sd_marlow @GaryMarcus @TonyZador Very relevant paper (on idea of selecting neural architectures for types of learning - i.e., meta-learning through evolution): Weight Agnostic Neural Networks
1 replies, 9 likes

Aug 06 2019 Kwabena Boahen

Super intriguing use of evolutionary algorithms to search for topologies instead of stochastic gradient descent to search for weights.
0 replies, 7 likes
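The contrast Boahen draws — an evolutionary search over topologies rather than gradient descent over weights — can be sketched in a few lines. Everything below (the 6-connection binary genome, the stand-in `fitness` function, the population sizes) is illustrative only, not the paper's actual NEAT-style search:

```python
import random

# Stand-in objective over a binary "topology": which of 6 possible
# connections exist. Here we just reward keeping connection 0 and
# staying sparse; a real search would score task performance.
def fitness(topology):
    return topology[0] * 3 - sum(topology)

def mutate(topology, rng):
    child = list(topology)
    i = rng.randrange(len(child))
    child[i] ^= 1            # flip one connection on/off
    return child

def evolve(generations=100, pop_size=8, seed=0):
    """Elitist truncation selection: keep the top half, mutate it."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(6)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        pop = parents + [mutate(rng.choice(parents), rng) for _ in parents]
    return max(pop, key=fitness)

best = evolve()
```

No gradients appear anywhere: the only search operators are mutation and selection, which is what lets the same machinery search discrete structures that SGD cannot.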

Jun 14 2019 Namhoon Lee

Seemingly great work here; it would have been nice if SNIP were mentioned in the related work as a pruning method for "untrained, randomly initialized neural networks".
1 replies, 6 likes

Jun 12 2019 Giovanni Petrantoni

We are working on a product very much inspired by this and neuroevolution in general. MNIST was a huge challenge for me in terms of CPU optimizations. Given you work at Google, and Google Cloud is probably cheap for you :) I wonder, how many VMs you had to throw at it?
0 replies, 6 likes

Jun 17 2019 Albert Cardona

One of the authors of “Weight agnostic neural networks” is on twitter:
1 replies, 6 likes

Aug 16 2019 Your Personal Neurocrackpot

Broke: Weighted connections & functional topologies Woke: Architectures & strong inductive biases Bespoke: Cybernetic infrastructures that attune systems to contexts and organise percolations of conjugate flows to generate "self-imploding explosions" in combinatorial search 💩
0 replies, 5 likes

Jun 13 2019 Abi Aryan

Excellent paper with a very interesting idea..
1 replies, 5 likes

Jun 12 2019 Statistics Papers

Weight Agnostic Neural Networks.
0 replies, 4 likes

Jun 13 2019 Dileep George

Interesting work on discovering idiosyncratic circuits for specific tasks. As I argue in this talk, this is the opposite of building general intelligence. Current DL can be thought of as a systematic way to discover idiosyncratic circuits.
1 replies, 4 likes

Jun 12 2019 Massimo Quadrana

Fix a random shared weight and search for the best architecture... Really fascinating work!
0 replies, 4 likes
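Quadrana's one-line summary is the paper's core evaluation loop: fix a single shared weight, score the architecture, repeat over several weight samples, and average. A toy sketch of that criterion — the tiny hand-wired XOR topology and the weight sample list here are hypothetical stand-ins, not the paper's search space:

```python
import math

# Toy fixed topology in the spirit of a WANN: a bias input, one hidden
# unit with a Gaussian activation, and every connection carrying the
# SAME shared weight w (connection signs are part of the topology).
def tiny_xor_net(x1, x2, w):
    bias = 1.0
    pre = w * x1 + w * x2 - w * bias   # = w * (x1 + x2 - 1)
    h = math.exp(-pre * pre)           # peaks when exactly one input is on
    return h                           # classify with h > 0.5

def mean_score(net, cases, weights=(-2.0, -1.0, -0.5, 0.5, 1.0, 2.0)):
    """Average accuracy over several shared-weight samples.

    Ranking architectures by this average means no single tuned weight
    can carry the solution -- the topology itself has to.
    """
    total = 0.0
    for w in weights:
        correct = sum(1 for (x1, x2, y) in cases
                      if (net(x1, x2, w) > 0.5) == y)
        total += correct / len(cases)
    return total / len(weights)

# XOR truth table: output True when the inputs differ.
cases = [(0, 0, False), (0, 1, True), (1, 0, True), (1, 1, False)]
score = mean_score(tiny_xor_net, cases)   # well above the 0.5 chance level
```

Because the Gaussian unit fires only when exactly one input is active, the architecture encodes XOR for most sampled weights; only the smallest-magnitude weights blur the decision.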

Jun 18 2019 BioDecoded

Weight Agnostic Neural Networks | arXiv #DeepLearning
0 replies, 4 likes

Jun 12 2019 Rebel Science

Deep learning experts should not claim to be inspired by biology. Unlike DNNs, brains don't process data but changes in the environment, aka transitions. The brain is mainly a massive timing mechanism. It doesn't optimize objective functions: no gradients, backprop or labels.
0 replies, 4 likes

Jun 12 2019 Pablo Cordero

Reminds me of liquid state machines and echo state networks, essentially finding good kernels to bootstrap from. @hardmaru 's interactive article is, as usual, in a class of its own; the interactive demos are superb.
0 replies, 3 likes

Sep 06 2019 Kevin Mitchell

@VenkRamaswamy @KordingLab See also: Weight Agnostic Neural Networks
0 replies, 3 likes

Jun 12 2019 Daniel Situnayake

This is so cool! Shifting complexity from weights to architecture, so a model can (kinda) work with random weights and can be trained from that starting point. I love that it was inspired by biology:
0 replies, 3 likes

Jun 17 2019 Sandeep Kishore

This is so true especially of motor networks. Good to go, relatively speaking, from the start. Interesting thread based on "weight agnostic neural networks" from @hardmaru
0 replies, 3 likes

Dec 05 2019 hardmaru

Adam Gaier will be presenting our paper on “Weight Agnostic Neural Networks” spotlight presentation (Dec 10th 5:25pm) → poster session (Dec 11th) → paper → tweetstorm ↓
0 replies, 3 likes

Jun 12 2019 Claus Aranha

Why worry about the little weights, when we can focus on the network structure instead? Fantastic work by @hardmaru, and a very easy to read page/paper. I love the idea of ensembles of untrained weights.
0 replies, 2 likes

Jun 13 2019 OGAWA, Tadashi

=> Weight Agnostic Neural Networks, Google, arXiv, Jun 11, 2019. Neural network architectures that can already perform a task without any explicit weight training, on the supervised learning domain. Interactive demo.
1 replies, 2 likes

Dec 17 2019 小猫遊りょう(たかにゃし・りょう)

Weight Agnostic Neural Networks
1 replies, 2 likes

Aug 28 2019 Dan Buscombe

There are now two ways to use artificial neural networks to solve data-driven problems: 1. Design a model architecture and have the computer learn optimal weights (traditional), or now 2. Have the computer learn how to build the architecture from the bricks you feed it.
1 replies, 2 likes

Jul 16 2019 bitcraft lab

Interesting article on Weight Agnostic Neural Networks by @hardmaru >>> … <<< makes me want to revisit Stuart Kauffman's Random Boolean Networks and look into Boolean Neural Networks :) #selforganisation #neuralnet #RBN #BNN
0 replies, 1 likes

Sep 06 2019 OGAWA, Tadashi

=> "Weight Agnostic Neural Networks", Google Brain, arXiv, Sep 5, 2019 (v2) How important are the weight parameters of a NN? "has been accepted as a spotlight presentation at NeurIPS2019", Sep 6, 2019
1 replies, 1 likes

Aug 07 2019 Bhav Ashok

Architecture: Brain as Weights: Learned knowledge, the structure compensates for what the environment/supervision/optimization lacks
0 replies, 1 likes

Jun 12 2019 Lana Sinapayen

Great idea here: focusing on architecture rather than weights. A bit like the innate/learned spectrum
0 replies, 1 likes

Jun 13 2019 Hamlet 🇩🇴

"This idea came out after a few drinks in Roppongi." The Unreasonable Effectiveness of Alcohol 🥃 👇👇👇🤯
0 replies, 1 likes

Jun 12 2019 Pierre Richemond

Some try to see the future, others embody it :) Congrats Adam and David !
1 replies, 0 likes

Jun 12 2019 Brundage Bot

Weight Agnostic Neural Networks. Adam Gaier and David Ha
1 replies, 0 likes