
Neural Tangents: Fast and Easy Infinite Neural Networks in Python

Comments

Google AI: Announcing Neural Tangents, a new easy-to-use, open-source neural network library that enables researchers to build finite- and infinite-width versions of neural networks simultaneously. Grab the code and try it for yourself at https://goo.gle/33eErSu https://t.co/bL6nQL2PoR

16 replies, 2100 likes


Sam Schoenholz: Neural Tangents is an open source library we've been working on to make it easy to build, train, and manipulate infinitely wide neural networks. To appear as a spotlight at ICLR. code: http://github.com/google/neural-tangents paper: https://arxiv.org/abs/1912.02803 colab: https://colab.research.google.com/github/google/neural-tangents/blob/master/notebooks/neural_tangents_cookbook.ipynb

5 replies, 930 likes


Sam Schoenholz: After a ton of work by a bunch of people, we're releasing an entirely new Neural Tangents. Paper: https://arxiv.org/abs/1912.02803 Github: https://github.com/google/neural-tangents Colab Notebook: https://colab.sandbox.google.com/github/google/neural-tangents/blob/master/notebooks/neural_tangents_cookbook.ipynb

6 replies, 523 likes


Jascha: Infinite width networks (NNGPs and NTKs) are the most promising lead for theoretical understanding in deep learning. But, running experiments with them currently resembles the dark age of ML research before ubiquitous automatic differentiation. Neural Tangents fixes that. https://twitter.com/sschoenholz/status/1202988151569973248

2 replies, 282 likes


hardmaru: Neural Tangents is a Python library designed to enable research into “infinite-width” neural networks. They provide an API for specifying complex neural network architectures that can then be trained and evaluated in their infinite-width limit. 🙉🤯 https://arxiv.org/abs/1912.02803

1 replies, 129 likes


Statistics Papers: Neural Tangents: Fast and Easy Infinite Neural Networks in Python. http://arxiv.org/abs/1912.02803

0 replies, 30 likes


Jascha: Paper: https://arxiv.org/abs/1912.02803 Github: https://github.com/google/neural-tangents Colab Notebook: https://colab.research.google.com/github/google/neural-tangents/blob/master/notebooks/neural_tangents_cookbook.ipynb

0 replies, 28 likes


Matthew McAteer: Easier “infinite-width” neural networks. 🤯 Seems like a pretty brilliant way to compress larger NNs. Plus, they've got a colab notebook demo showing how to build them in Jax (which I guess is what all the cool kids are using these days). https://colab.research.google.com/github/google/neural-tangents/blob/master/notebooks/neural_tangents_cookbook.ipynb https://t.co/1PFQx3wcht

0 replies, 18 likes


MONTREAL.AI: Neural Tangents: Fast and Easy Infinite Neural Networks in Python Novak et al.: https://arxiv.org/abs/1912.02803 #DeepLearning #NeuralNetworks #Python https://t.co/JuzLIh5kxy

0 replies, 9 likes


Djaafar🇩🇿جعفر: Neural Tangents: Fast and Easy Infinite Neural Networks in Python https://arxiv.org/abs/1912.02803 #DeepLearning #NeuralNetworks #Python https://t.co/XG6zZeMtqr

0 replies, 1 likes


Content

Found on Mar 13 2020 at https://arxiv.org/pdf/1912.02803.pdf

PDF content of the paper: Neural Tangents: Fast and Easy Infinite Neural Networks in Python