
Towards Learning Convolutions from Scratch

Comments

Behnam Neyshabur: 💡💡What is the best accuracy an MLP can get on CIFAR10❓ 65%❓ No, 85%‼️ Trying to understand convolutions, we look at MDL and come up with a variant of LASSO that, when applied to MLPs, learns local connections and achieves amazing accuracy! Paper: https://arxiv.org/abs/2007.13657 1/n https://t.co/ijrX9CFJ41

11 replies, 818 likes
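The tweet above describes β-LASSO as an ℓ1-style regularizer that, applied to a plain MLP, drives it toward local (convolution-like) connectivity. A minimal sketch of one such update step, assuming β-LASSO behaves like ℓ1-regularized SGD followed by zeroing weights whose magnitude falls below β·λ (the function name, constants, and exact thresholding rule here are illustrative assumptions, not taken verbatim from the paper):

```python
import numpy as np

def beta_lasso_step(w, grad, lr=0.1, lam=1e-5, beta=50.0):
    """One hypothetical beta-LASSO update (sketch, not the paper's exact rule):
    an SGD step on the loss plus an l1 penalty, then aggressive sparsification
    by zeroing every weight smaller in magnitude than beta * lam."""
    w = w - lr * (grad + lam * np.sign(w))  # l1-regularized gradient step
    w[np.abs(w) < beta * lam] = 0.0         # hard-threshold small weights
    return w

# Tiny usage example: a near-zero weight is pruned, a large one survives.
w = beta_lasso_step(np.array([1e-4, 0.5]), np.zeros(2))
```

The intuition is that the hard threshold β·λ prunes the many weak long-range connections of a dense layer, so only spatially coherent local weights survive training.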


hardmaru: Towards Learning Convolutions from Scratch “As ML moves towards reducing the expert bias and learning it from data, a natural next step seems to be learning convolution-like structures from scratch.” Would be great to find the "ConvNet" for new domains. https://arxiv.org/abs/2007.13657 https://t.co/vYM3DjOXDC

3 replies, 228 likes


Preetum Nakkiran: .@bneyshabur will be speaking about this paper on Thursday 4pm ET at our ML Theory seminar. Our seminars are highly interactive, so do join if you want to ask questions! Register on the mailing list for zoom link: https://mltheory.org/

1 replies, 42 likes


Brandon Rohrer: 🚨 Useful Neural Networks Paper Alert 🚨 This β-LASSO work, just out from @bneyshabur (Behnam Neyshabur), is eye-opening.

1 replies, 34 likes


Andreas Madsen: Great dissection of Conv-NN vs MLP.

0 replies, 13 likes


Irina Rish: LASSO rules! :)

0 replies, 9 likes


Rohan Anil: I really like this work. Excited to read more, MLP is all you need.

0 replies, 8 likes


RahimEntezari: Is it possible to learn local connectivity without convolutions? Do we need overparametrized networks to achieve the best results? Check out this interesting paper by @bneyshabur

0 replies, 6 likes


Utku: Very interesting!

0 replies, 3 likes


Robert Y. Chen: Really important work from @bneyshabur ! The approach is guided by understanding WHY it works, and engineering it with a biologically inspired approach of emergence, rather than human hand coding. Really beautiful work, congratulations!

0 replies, 2 likes


HotComputerScience: Most popular computer science paper of the day: "Towards Learning Convolutions from Scratch" https://hotcomputerscience.com/paper/towards-learning-convolutions-from-scratch https://twitter.com/bneyshabur/status/1287936315829313536

0 replies, 2 likes


Adam M. Smith: When I tried training a binary neural network on MNIST with description-length constraints using a MaxSAT solver (in 2013), the learned filters looked a lot like what they visualize for β-LASSO here:

1 replies, 2 likes


Hamid EBZD: Towards Learning Convolutions from Scratch https://arxiv.org/abs/2007.13657

1 replies, 0 likes


Content

Found on Jul 28 2020 at https://arxiv.org/pdf/2007.13657.pdf

Full PDF of the paper: Towards Learning Convolutions from Scratch