
ON THE RELATIONSHIP BETWEEN SELF-ATTENTION AND CONVOLUTIONAL LAYERS

Comments

Jan 10 2020 Jean-Baptiste Cordonnier

Very happy to share our latest work accepted at #ICLR2020: we prove that a self-attention layer can express any CNN layer. 1/5 📄Paper: https://openreview.net/pdf?id=HJlnC1rKPB 🍿Interactive website: https://epfml.github.io/attention-cnn/ 🖥Code: https://github.com/epfml/attention-cnn 📝Blog: http://jbcordonnier.com/posts/attention-cnn/ https://t.co/X1rNS1JvPt
5 replies, 1151 likes
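The core idea behind the claim above can be illustrated in a toy setting: if each attention head attends with a one-hot pattern to a fixed relative offset, and each head's value projection equals one slice of a convolution kernel, summing the head outputs reproduces the convolution. The sketch below shows this for a 1D sequence with NumPy; it is a simplified illustration (the paper proves the 2D multi-head case with relative positional encodings), not the authors' code, and all function names here are made up for the demo.

```python
import numpy as np

def conv1d(x, w):
    """Plain 1D convolution with zero padding.
    x: (T, C_in) sequence, w: (K, C_in, C_out) kernel, K odd."""
    T, C_in = x.shape
    K, _, C_out = w.shape
    pad = K // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))   # zero-pad the sequence
    out = np.zeros((T, C_out))
    for t in range(T):
        for k in range(K):
            out[t] += xp[t + k] @ w[k]
    return out

def attention_as_conv(x, w):
    """Same output computed as K attention heads, each with a
    one-hot attention matrix that looks at one relative offset."""
    T, C_in = x.shape
    K, _, C_out = w.shape
    pad = K // 2
    out = np.zeros((T, C_out))
    for k in range(K):              # one head per kernel position
        shift = k - pad             # this head attends to offset `shift`
        A = np.zeros((T, T))        # deterministic one-hot attention
        for t in range(T):
            s = t + shift
            if 0 <= s < T:
                A[t, s] = 1.0
        out += A @ x @ w[k]         # value projection = kernel slice w[k]
    return out

x = np.random.randn(10, 4)          # T=10 positions, C_in=4 channels
w = np.random.randn(3, 4, 5)        # kernel size K=3, C_out=5
assert np.allclose(conv1d(x, w), attention_as_conv(x, w))
```

The assertion passes because both loops accumulate the same terms `x[t + k - pad] @ w[k]`, with out-of-range positions contributing zero (via padding in one case, a missing attention entry in the other). The paper's actual result is the converse direction made rigorous: a multi-head self-attention layer with enough heads and suitable relative positional encodings can realize exactly such one-hot patterns, hence any convolution.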


Jan 11 2020 hardmaru

Self-attention is proving to be a really good unified prior for both image and sequence processing. It can probably also learn useful representations for images that are difficult for conv layers to learn. See the author’s thread for the blog post and web demo: https://twitter.com/jb_cordonnier/status/1215581826187743232?s=21
1 replies, 139 likes


Jan 11 2020 Torsten Scholak

After reading @chrmanning et al.’s paper on what BERT looks at, https://arxiv.org/abs/1906.04341, this makes intuitive sense to me
1 replies, 75 likes


Jan 11 2020 Maks Sorokin 🦾

@hardmaru @iclr_conf @jb_cordonnier Author's tweet: https://twitter.com/jb_cordonnier/status/1215581826187743232?s=21
0 replies, 6 likes
