
What Does BERT Look At? An Analysis of BERT’s Attention

Comments

Jan 10 2020 Jean-Baptiste Cordonnier

Very happy to share our latest work accepted at #ICLR2020: we prove that a Self-Attention layer can express any CNN layer. 1/5 📄Paper: https://openreview.net/pdf?id=HJlnC1rKPB 🍿Interactive website: https://epfml.github.io/attention-cnn/ 🖥Code: https://github.com/epfml/attention-cnn 📝Blog: http://jbcordonnier.com/posts/attention-cnn/ https://t.co/X1rNS1JvPt
5 replies, 1121 likes
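
[Editor's note: for readers who stop at the link, the thread's claim is the paper's main theorem. The statement below is paraphrased from memory of Cordonnier et al. (ICLR 2020); the notation is reconstructed, not copied from the paper.]

\documentclass{article}
\usepackage{amsmath,amssymb,amsthm}
\newtheorem{theorem}{Theorem}
\begin{document}
% Paraphrase of the main result of Cordonnier et al. (ICLR 2020);
% notation reconstructed from memory, not copied from the paper.
\begin{theorem}
A multi-head self-attention layer with $N_h$ heads of dimension $D_h$,
output dimension $D_{\mathrm{out}}$, and relative positional encodings of
dimension $D_p \ge 3$ can express any convolutional layer of kernel size
$\sqrt{N_h} \times \sqrt{N_h}$ with $\min(D_h, D_{\mathrm{out}})$ output
channels.
\end{theorem}
\end{document}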


Jun 27 2019 Kevin Clark

Code for our paper "What Does BERT Look At? An Analysis of BERT's Attention" (https://arxiv.org/abs/1906.04341) has been released! https://github.com/clarkkev/attention-analysis
1 reply, 377 likes


Jun 12 2019 Kevin Clark

Check out our new #BlackboxNLP paper "What Does BERT Look At? An Analysis of BERT's Attention" with @ukhndlwl @omerlevy @chrmanning! https://arxiv.org/abs/1906.04341 Among other things, we show that BERT's attention corresponds surprisingly well to aspects of syntax and coreference. https://t.co/SWh1qMIKX1
1 reply, 364 likes
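
[Editor's note: the attention maps the paper analyzes are easy to inspect yourself. The sketch below is not the authors' released code; it is a minimal example using the HuggingFace transformers API, and the model name and example sentence are my own choices. It surfaces one pattern the paper reports: heads that concentrate their attention on [SEP].]

import torch
from transformers import BertModel, BertTokenizer

# Load BERT-base with attention outputs enabled (HuggingFace API).
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True).eval()

inputs = tokenizer("The keys to the cabinet are on the table.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions: 12 layers, each (batch=1, heads=12, seq_len, seq_len).
attn = torch.stack(outputs.attentions).squeeze(1)  # (layers=12, heads=12, seq, seq)
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])

# One pattern the paper reports: some heads put most of their weight on [SEP].
sep = tokens.index("[SEP]")
attn_to_sep = attn[..., sep].mean(dim=-1)  # avg over query positions -> (layers, heads)
print(attn_to_sep)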


Jun 21 2019 Tuhin Chakrabarty

https://arxiv.org/pdf/1906.04341.pdf What Does BERT Look At? An Analysis of BERT’s Attention. Really awesome analysis paper from @stanfordnlp. A must-read.
0 replies, 213 likes


Aug 02 2019 Stanford NLP Group

What Does BERT Look At? An Analysis of BERT's Attention. Kevin Clark @clark_kev, Urvashi Khandelwal @ukhndlwl, Omer Levy @omerlevy_, Christopher D. Manning @chrmanning. https://arxiv.org/abs/1906.04341
2 replies, 207 likes


Jun 12 2019 Jelle Zuidema

Interesting paper from Kevin Clark et al. from StanfordNLP & FAIR, using attention maps and diagnostic classifiers ("probing classifiers") to assess what linguistic information is processed where and how in BERT's attention heads. https://twitter.com/clark_kev/status/1138871422053376000
1 reply, 129 likes
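
[Editor's note: for readers unfamiliar with "probing classifiers": the idea is to fit a simple classifier on frozen model features and see how well a linguistic property can be read off them. The sketch below is a generic illustration of that idea, not the paper's attention-based probes; the toy punctuation label, layer choice, and sentences are mine.]

import torch
from sklearn.linear_model import LogisticRegression
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased").eval()

def token_features(sentence, layer=8):
    # Frozen hidden states for each wordpiece at one (arbitrarily chosen) layer.
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs, output_hidden_states=True).hidden_states[layer]
    return hidden[0]  # (seq_len_with_specials, 768)

# Toy stand-in for a real linguistic annotation (POS tags, dependency heads, ...):
# is the wordpiece a punctuation mark?
sentences = ["Paris is in France .", "apple bought a startup in London ."]
X, y = [], []
for s in sentences:
    pieces = tokenizer.tokenize(s)
    feats = token_features(s)[1 : 1 + len(pieces)]  # drop [CLS] and [SEP]
    X.extend(feats.numpy())
    y.extend(int(p in {".", ","}) for p in pieces)

# The probe itself: a simple linear classifier over frozen features.
probe = LogisticRegression(max_iter=1000).fit(X, y)
print("probe train accuracy:", probe.score(X, y))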


Jan 11 2020 Torsten Scholak

After reading @chrmanning et al.'s paper on what BERT looks at, https://arxiv.org/abs/1906.04341, this makes intuitive sense to me
1 reply, 75 likes


Jun 30 2019 elvis

BERT in one paragraph. I love this kind of clear and concise writing. 😍 https://arxiv.org/abs/1906.04341 Clark et al., 2019 https://t.co/eNwv6OVk2U
0 replies, 40 likes


Aug 01 2019 Jelle Zuidema

Congratulations to @clark_kev, @chrmanning, and colleagues on this year's #BlackboxNLP best paper award! https://twitter.com/wzuidema/status/1138923655885459457
0 replies, 19 likes


Jun 12 2019 Jelle Zuidema

Authors of that new paper (https://arxiv.org/abs/1906.04341) are @clark_kev @ukhndlwl @omerlevy @chrmanning Some more BERTology papers include: https://arxiv.org/abs/1906.01698 https://arxiv.org/abs/1905.05950 and our own: https://arxiv.org/abs/1906.01539 Many will be presented at @ACL2019_Italy and #BlackboxNLP
2 replies, 6 likes


Jun 13 2019 Mihail Eric

I’m a big fan of this line of work showing that our new fancy models are picking up on the same phenomena as old school #nlproc pipelines. Nice work @clark_kev @ukhndlwl!
0 replies, 2 likes


Jun 13 2019 Jonathan Raiman

This is wild
0 replies, 2 likes


Aug 02 2019 reza mahmoudi

What Does BERT Look At? An Analysis of BERT's Attention Clark et al.: https://arxiv.org/abs/1906.04341 #ArtificialIntelligence #DeepLearning #NLP #NaturalLanguageProcessing #AI #MachineLearning https://t.co/4JdRwqL1fY
1 reply, 1 like


Jun 22 2019 sileye ba

What does BERT look at? https://arxiv.org/pdf/1906.04341.pdf
0 replies, 1 like


Sep 23 2019 Aakash Kumar Nain 🔎

@pbloemesquire https://arxiv.org/abs/1906.04341
0 replies, 1 like

