Sparse Networks from Scratch: Faster Training without Losing Performance

Comments

Jul 11 2019 Tim Dettmers

My new work with @LukeZettlemoyer on accelerated training of sparse networks from random weights to dense performance levels — no retraining required! Paper: https://arxiv.org/abs/1907.04840 Blog post: https://timdettmers.com/2019/07/11/sparse-networks-from-scratch/ Code: https://github.com/TimDettmers/sparse_learning https://t.co/2UDdhhWZhG
3 replies, 293 likes
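

For readers skimming the thread: the core mechanism behind the paper is sparse momentum — periodically prune the smallest-magnitude active weights in each layer, then regrow the same number of connections where the optimizer's momentum magnitude is largest. Below is a minimal per-layer sketch of that idea in PyTorch. It is an illustration only, not the sparse_learning library's API: the function name prune_and_regrow is made up for this example, and the paper's cross-layer redistribution of weights by mean momentum magnitude is omitted here.

import torch

def prune_and_regrow(weight, momentum, mask, prune_rate=0.2):
    # One simplified sparse-momentum step for a single layer:
    # drop the smallest surviving weights, then regrow the same
    # number of connections where momentum magnitude is largest.
    was_inactive = ~mask.bool()                # snapshot before pruning
    n_prune = int(prune_rate * mask.sum().item())
    if n_prune == 0:
        return mask

    # Prune: zero the n_prune active weights with the smallest magnitude.
    active_mag = torch.where(mask.bool(), weight.abs(),
                             torch.full_like(weight, float("inf")))
    drop = torch.topk(active_mag.flatten(), n_prune, largest=False).indices
    mask.flatten()[drop] = 0.0

    # Regrow: enable the n_prune previously inactive positions whose
    # momentum magnitude is largest; regrown weights start at zero.
    inactive_mom = torch.where(was_inactive, momentum.abs(),
                               torch.full_like(momentum, -float("inf")))
    grow = torch.topk(inactive_mom.flatten(), n_prune).indices
    mask.flatten()[grow] = 1.0

    weight.data.mul_(mask)   # keep pruned positions at exactly zero
    return mask

# Toy usage: a layer kept at roughly 20% density, as discussed in the thread.
w = torch.randn(256, 256)
mask = (torch.rand_like(w) < 0.2).float()
w.mul_(mask)
mom = torch.randn_like(w)    # stand-in for the optimizer's momentum buffer
mask = prune_and_regrow(w, mom, mask)

Because the total number of active weights is conserved at each step, the network stays at a fixed sparsity budget throughout training, which is what allows sparse training to be faster than dense training.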


Sep 13 2019 Tim Dettmers

The v1.0 Release of my sparse learning library includes a lot of bugfixes and more rigorous ImageNet baselines: https://github.com/TimDettmers/sparse_learning/releases/tag/v1.0 Our paper on accelerating training through sparsity was also updated with dense equivalent results and new analyses: https://arxiv.org/abs/1907.04840 https://t.co/My6kAiIA7K
0 replies, 78 likes


Aug 09 2019 ML Review

Sparse Networks from Scratch: Faster Training without Losing Performance By @Tim_Dettmers Finds "winning lottery tickets" – sparse configurations with 20% weights and similar performance. SoTA on MNIST, CIFAR-10, and ImageNet-2012 among sparse methods https://arxiv.org/abs/1907.04840 https://t.co/PLtz82zaYT
0 replies, 67 likes


Aug 08 2019 Tim Dettmers

Soon, I will also update our Sparse Networks from Scratch paper (https://arxiv.org/abs/1907.04840) with an analysis of the momentum and pruning-rate parameters showing that sparse momentum is highly stable and easy to use. Also included: a study on dense performance equivalents (see below). https://t.co/Pqba8fZ4Ya
2 replies, 56 likes


Jul 12 2019 Brian

Jeff Hawkins and his group have been working on sparse networks as well, also inspired by neuroscience. Have you read his Thousand Brains work? https://www.youtube.com/watch?v=5LFo36g4Lug
1 reply, 8 likes


Jul 14 2019 Thaddeus Preston

While we complain about #Bond25 and a Government-created #BorderCrisis, there are others slowly building the future. #AI #MachineLearning
0 replies, 1 likes
