
Mish: A Self Regularized Non-Monotonic Neural Activation Function

Comments

Jeremy Howard: This new activation function has seen quite a bit of success in the @fastdotai community already, including generating a forum discussion with over 500 posts (including many from the author of the paper)! https://forums.fast.ai/t/meet-mish-new-activation-function-possible-successor-to-relu/53299

10 replies, 638 likes


ML Review: Mish: Self Regularized Non-Monotonic Activation Function by @DigantaMisra1. f(x) = x · tanh(softplus(x)). Increased accuracy over Swish/ReLU; increased performance over Swish. Github https://github.com/digantamisra98/Mish ArXiv https://arxiv.org/abs/1908.08681v2 https://t.co/zlyQ0hwggt

3 replies, 257 likes
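
For reference, here is a minimal sketch of the Mish function from the tweet above, f(x) = x · tanh(softplus(x)), written in plain NumPy (the function names and the NumPy dependency are choices for this sketch; the official implementation is in the linked GitHub repo):

```python
import numpy as np

def softplus(x):
    # softplus(x) = ln(1 + e^x); np.logaddexp(0, x) computes
    # log(e^0 + e^x) in a numerically stable way for large |x|
    return np.logaddexp(0.0, x)

def mish(x):
    # Mish: f(x) = x * tanh(softplus(x))
    return x * np.tanh(softplus(x))

# Example: Mish is smooth, unbounded above, bounded below,
# and non-monotonic (slightly negative) just left of zero
xs = np.array([-3.0, -1.0, 0.0, 1.0, 3.0])
print(mish(xs))
```

Note the stable softplus: computing np.log(1 + np.exp(x)) directly would overflow for large positive x, whereas logaddexp avoids that.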


Trending Papers: [10/10] 📈 - Mish: A Self Regularized Non-Monotonic Neural Activation Function - 525 ⭐ - 📄 https://arxiv.org/pdf/1908.08681v2.pdf - 🔗 https://github.com/digantamisra98/Mish

0 replies, 4 likes


Diganta Misra: Trending on @paperswithcode at 10th position. :) #DeepLearning

2 replies, 4 likes


Content

Found on Oct 14 2019 at https://arxiv.org/pdf/1908.08681v2.pdf

PDF content of a computer science paper: Mish: A Self Regularized Non-Monotonic Neural Activation Function