A mathematical theory of semantic development in deep neural networks

Comments

Oct 28 2018 Surya Ganguli

1/ New #deeplearning paper at the intersection of #AI #mathematics #psychology and #neuroscience: A mathematical theory of semantic development in deep neural networks: https://arxiv.org/abs/1810.10531 Thanks to awesome collaborators Andrew Saxe and Jay McClelland! https://t.co/ibmjvlAGlU
4 replies, 492 likes


May 18 2019 Surya Ganguli

Our paper on "A mathematical theory of semantic development in deep neural networks" with Andrew Saxe and Jay McClelland is now out in @PNASNews: https://www.pnas.org/content/early/2019/05/16/1820226116 (arxiv version here: https://arxiv.org/abs/1810.10531) And old tweetstorm here: https://twitter.com/SuryaGanguli/status/1056606478285455360
2 replies, 293 likes


Jun 29 2019 Clément Farabet

[1/4] Vacationing, and I really enjoyed reading this fantastic paper on the dynamics of learning in DNNs. The authors focus on deep linear networks to show how, with just 2 linearly stacked layers, learning naturally decomposes into stages that follow the complexity of the data.
1 reply, 97 likes


May 21 2019 Fei-Fei Li

A freshly out-of-the-oven @StanfordHAI work highlighting the interdisciplinary thinking of #AI, psychology and neuroscience!
1 reply, 68 likes


Jun 27 2019 Naomi Saphra

"A mathematical theory of semantic development in deep neural networks" http://arxiv.org/abs/1810.10531 is really cool! Saxe et al. theoretically demonstrate why even in linear networks, extra layers will bias SGD towards learning broad categories before fine-grained distinctions. https://t.co/RQ2TIsZP23
0 replies, 19 likes
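The coarse-to-fine dynamic Saphra describes can be sketched in a toy simulation. In a two-layer linear network, the singular modes of the input-output correlation matrix are learned at speeds that grow with their singular values, so the broad (large singular value) distinction is acquired before the fine ones. Everything below — the 4-item dataset, the feature values, and the hyperparameters — is an illustrative assumption, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dataset: 4 items as one-hot inputs; output features
# encode one broad split (large singular value) and two fine splits.
X = np.eye(4)
Y = np.array([
    [ 2.,  2., -2., -2.],   # broad distinction, singular value 4
    [ 1., -1.,  0.,  0.],   # fine distinction, singular value sqrt(2)
    [ 0.,  0.,  1., -1.],   # fine distinction, singular value sqrt(2)
])

# SVD modes of the input-output correlation matrix (here Y @ X.T = Y)
U, S, Vt = np.linalg.svd(Y, full_matrices=False)

# Two-layer ("deep") linear network y_hat = W2 @ W1 @ x, small random init
hidden, scale, lr = 8, 1e-3, 0.05
W1 = scale * rng.standard_normal((hidden, 4))
W2 = scale * rng.standard_normal((3, hidden))

half_time = [None] * 3          # step at which each mode reaches half strength
for step in range(2000):
    E = W2 @ W1 - Y             # full-batch error (X is the identity)
    W2, W1 = W2 - lr * (E @ W1.T), W1 - lr * (W2.T @ E)
    eff = np.diag(U.T @ (W2 @ W1) @ Vt.T)   # learned strength per SVD mode
    for i in range(3):
        if half_time[i] is None and eff[i] > S[i] / 2:
            half_time[i] = step

print(half_time)  # the broad mode crosses half strength before the fine modes
```

Each mode's strength follows a roughly sigmoidal trajectory, and the rise time scales inversely with the singular value, which is why the network exhibits stage-like learning even though the architecture is purely linear.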


Jun 27 2019 Kovas Boguta

A truly interesting paper https://t.co/yHangRlA6k
0 replies, 8 likes


May 18 2019 Steph Nelli

Such a cool paper
0 replies, 1 like
