Papers of the day

Mish: A Self Regularized Non-Monotonic Neural Activation Function

Comments

Oct 14 2019 Jeremy Howard

This new activation function has seen quite a bit of success in the @fastdotai community already, including generating a forum discussion with over 500 posts (including many from the author of the paper)! https://forums.fast.ai/t/meet-mish-new-activation-function-possible-successor-to-relu/53299
10 replies, 638 likes


Oct 13 2019 ML Review

Mish: Self Regularized Non-Monotonic Activation Function, by @DigantaMisra1. f(x) = x · tanh(softplus(x)). Increased accuracy over Swish/ReLU; increased performance over Swish. GitHub: https://github.com/digantamisra98/Mish ArXiv: https://arxiv.org/abs/1908.08681v2 https://t.co/zlyQ0hwggt
3 replies, 257 likes
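
The tweet above gives the full definition: f(x) = x · tanh(softplus(x)). A minimal sketch of that formula in plain Python (function names are mine, not from the paper's repository; the paper's reference implementation lives at the GitHub link above):

```python
import math

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)),
    # rewritten to avoid overflow for large positive x.
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def mish(x):
    # Mish activation as defined in the paper:
    # f(x) = x * tanh(softplus(x))
    return x * math.tanh(softplus(x))
```

Like Swish, Mish is smooth and non-monotonic: it passes through zero at the origin, approaches the identity for large positive inputs, and decays toward zero (rather than clipping hard like ReLU) for large negative inputs.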

