
Proving the Lottery Ticket Hypothesis: Pruning is All You Need

Comments

Daniel Roy: Cool result, though the level of overparametrization seems too large to explain the empirical LTH phenomenon. https://arxiv.org/abs/2002.00585 https://t.co/SIntk7tcIx

3 replies, 130 likes


Statistics Papers: Proving the Lottery Ticket Hypothesis: Pruning is All You Need. http://arxiv.org/abs/2002.00585

0 replies, 33 likes


Hacker News: Proving the Lottery Ticket Hypothesis: Pruning is All You Need https://arxiv.org/abs/2002.00585

0 replies, 15 likes


Daisuke Okanohara: An overparametrized NN with random weights contains a subnetwork that can approximate the behavior of any target network (of half the depth) arbitrarily well, a stronger statement than the lottery ticket hypothesis. We can train an NN just by pruning, w/o weight tuning. https://arxiv.org/abs/2002.00585

0 replies, 13 likes


Dimitris Papailiopoulos: A great recent work by Malach et al. [https://arxiv.org/pdf/2002.00585.pdf] gives the first theoretical analysis of this phenomenon, which they refer to as the "strong LTH": one can provably approximate a net of width d and depth l by pruning a random one that is O(d^4 * l^2) times wider.

1 reply, 7 likes
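The idea behind the comments above, that a sufficiently wide random network can be "trained" purely by deleting weights, can be illustrated with a toy NumPy sketch. This is our own illustration, not the paper's construction: it approximates a single target weight vector by greedily choosing a binary mask over a much wider pool of fixed random weights (a one-layer stand-in for the strong LTH statement). All variable names and the greedy mask search are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 4             # width of the target layer
factor = 200      # overparametrization factor (the paper needs O(d^4 * l^2))
target = rng.normal(size=d)                # target weights to approximate
random_w = rng.normal(size=(d, factor))    # fixed random "wide" weights, never tuned

mask = np.zeros_like(random_w, dtype=bool)
approx = np.zeros(d)

# Greedy pruning: for each output unit, sweep the random weights from
# largest to smallest magnitude and keep one only if adding it moves the
# running sum closer to the target; everything else is pruned away.
for i in range(d):
    for j in np.argsort(-np.abs(random_w[i])):
        if abs(approx[i] + random_w[i, j] - target[i]) < abs(approx[i] - target[i]):
            mask[i, j] = True
            approx[i] += random_w[i, j]

err = np.max(np.abs(approx - target))
```

With only ~200 random candidates per unit, the kept subnetwork already matches the target weights to small error, despite no weight ever being updated; the paper's contribution is proving a quantitative version of this for deep ReLU networks.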


Content

Found on Feb 05 2020 at https://arxiv.org/pdf/2002.00585.pdf

PDF content of a computer science paper: Proving the Lottery Ticket Hypothesis: Pruning is All You Need