
Object-Centric Learning with Slot Attention

Comments

Thomas Kipf: Excited to share our work @GoogleAI on Object-centric Learning with Slot Attention! Slot Attention is a simple module for structure discovery and set prediction: it uses iterative attention to group perceptual inputs into a set of slots. Paper: https://arxiv.org/abs/2006.15055 [1/7] https://t.co/0CzWO9B1fV

14 replies, 731 likes
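
Kipf's one-line description maps directly onto the module's update loop: learned slot vectors repeatedly query the inputs, a softmax across slots makes them compete for input features, and each slot is updated with a recurrent step. Below is a minimal PyTorch sketch of that loop, following the paper's Algorithm 1; it is not the official TensorFlow release linked further down the thread, and the layer sizes and default hyperparameters here are illustrative.

    import torch
    import torch.nn as nn

    class SlotAttention(nn.Module):
        """Minimal sketch of the Slot Attention module (Locatello et al., 2020).

        Maps N input feature vectors to K slots via a few rounds of
        iterative attention. Defaults are illustrative, not the paper's.
        """

        def __init__(self, num_slots=7, dim=64, iters=3, hidden_dim=128, eps=1e-8):
            super().__init__()
            self.num_slots, self.iters, self.eps = num_slots, iters, eps
            self.scale = dim ** -0.5

            # Slots are sampled from a learned Gaussian, so K can change at test time.
            self.slots_mu = nn.Parameter(torch.randn(1, 1, dim))
            self.slots_logsigma = nn.Parameter(torch.zeros(1, 1, dim))

            self.to_q = nn.Linear(dim, dim)   # queries come from the slots
            self.to_k = nn.Linear(dim, dim)   # keys and values come from the inputs
            self.to_v = nn.Linear(dim, dim)

            self.gru = nn.GRUCell(dim, dim)
            self.mlp = nn.Sequential(
                nn.Linear(dim, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, dim))

            self.norm_inputs = nn.LayerNorm(dim)
            self.norm_slots = nn.LayerNorm(dim)
            self.norm_mlp = nn.LayerNorm(dim)

        def forward(self, inputs):                      # inputs: (batch, N, dim)
            b, n, d = inputs.shape
            inputs = self.norm_inputs(inputs)
            k, v = self.to_k(inputs), self.to_v(inputs)

            mu = self.slots_mu.expand(b, self.num_slots, -1)
            sigma = self.slots_logsigma.exp().expand(b, self.num_slots, -1)
            slots = mu + sigma * torch.randn_like(mu)   # (batch, K, dim)

            for _ in range(self.iters):
                slots_prev = slots
                q = self.to_q(self.norm_slots(slots))

                # Softmax over the *slot* axis: slots compete to explain each input.
                attn = torch.einsum('bnd,bkd->bnk', k, q * self.scale).softmax(dim=-1)
                attn = attn + self.eps
                attn = attn / attn.sum(dim=1, keepdim=True)          # weighted mean
                updates = torch.einsum('bnk,bnd->bkd', attn, v)      # (batch, K, dim)

                # Recurrent update; all slots share the same GRU and MLP parameters.
                slots = self.gru(updates.reshape(-1, d),
                                 slots_prev.reshape(-1, d)).reshape(b, -1, d)
                slots = slots + self.mlp(self.norm_mlp(slots))

            return slots

Calling the module on a (batch, N, dim) tensor of features (CNN outputs plus position embeddings, in the paper's setup) returns a (batch, K, dim) set of slots for a downstream decoder or set-prediction head.
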


Thomas Kipf: Happy to learn that our work on Slot Attention has been accepted for spotlight presentation at @NeurIPSConf!

4 replies, 388 likes


Thomas Kipf: We have released the code for Slot Attention (incl. pre-trained model checkpoints on CLEVR) Code: https://github.com/google-research/google-research/tree/master/slot_attention NeurIPS camera ready: https://arxiv.org/abs/2006.15055 https://t.co/1lNagRiKoy

1 reply, 386 likes


Francesco Locatello: Super excited to share what I’ve been working on in the past months during my internship at Google Brain in Amsterdam: "Object-Centric Learning with Slot Attention" https://arxiv.org/pdf/2006.15055.pdf @GoogleAI [1/7] https://t.co/1aNYLm4exj

4 replies, 341 likes


Thomas Kipf: SCOUTER: An explainable image classifier using a modified version of Slot Attention by Liangzhi Li et al. (Osaka University) SCOUTER: https://arxiv.org/abs/2009.06138 Slot Attention: https://arxiv.org/abs/2006.15055 https://t.co/u6MfKTwoMj

2 replies, 206 likes


Alexandr Kalinin: There is already a #PyTorch implementation of the Slot Attention module: https://github.com/lucidrains/slot-attention

0 replies, 187 likes


Francesco Locatello: We have finally released the code for Slot Attention (+ some checkpoints) and updated the paper! Code: https://github.com/google-research/google-research/tree/master/slot_attention #NeurIPS2020 camera ready: https://arxiv.org/abs/2006.15055 https://t.co/SyqWuJckv5

0 replies, 124 likes


Francesco Locatello: Really cool to see Slot Attention powering explainable image classifiers! When writing the broader impact statement for #NeurIPS2020, we were hoping to see developments in this direction. I'm thrilled Liangzhi Li et al. did it.

1 reply, 42 likes


Andrew Davison: Slot attention sounds like an interesting concept for structure discovery.

0 replies, 24 likes


Anirudh Goyal: I liked this work by @thomaskipf and @FrancescoLocat8. They use RIM-style, slot-based top-down attention, with the caveat that all the "slots" share the same parameters and are updated via recurrence, which provides permutation equivariance. I'm glad you guys tried it! :)

0 replies, 15 likes


Peter Steinbach: Does this mark the ☀️rise of unsupervised segmentation? Would love to try this with scientific data as opposed to natural scenes! Volunteers? @martweig @uschmidt83 @sagzehn @helmholtz_ai @noreenwalk @haesleinhuepf Still combing through this to check whether multiple instances of a single class work too.

2 replies, 13 likes


Sungjin Ahn: So excited that we have @thomaskipf as our invited speaker at the ICML Workshop on Object-Oriented Learning (WOOL)! Looking forward to the talk! Join us at ICML WOOL!

0 replies, 9 likes


Patrick Emami: Transformer-like attention can be used for perceptual grouping! We previously saw it used for learning patch-like object-part representations via capsules in SCAE. This approach seems to lose IODINE's translation equivariance, though, which is important for sequences 🤔

1 reply, 4 likes


SLAM-Hub: Object-Centric Learning with Slot Attention (Under review) Paper: https://arxiv.org/abs/2006.15055

0 replies, 2 likes


Daisuke Okanohara: Slot attention is a map from N input feature vectors to K output vectors (slots) using dot-product attention and iterative routing (cf. capsules), and is input-permutation invariant and output-permutation equivariant, ideal for representing a set of objects. https://arxiv.org/abs/2006.15055

0 replies, 2 likes
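
Okanohara's two symmetry properties can be sanity-checked numerically. The snippet below is a hypothetical check against the SlotAttention sketch shown earlier in this thread: with the random slot initialization pinned by a seed, shuffling the inputs should leave the resulting slot set unchanged up to floating-point noise. Output-permutation equivariance follows because all slots share parameters and interact only through the symmetric softmax.

    import torch

    torch.manual_seed(0)
    module = SlotAttention(num_slots=4, dim=64).eval()  # sketch from above

    x = torch.randn(1, 10, 64)          # 10 input feature vectors
    perm = torch.randperm(10)           # random reordering of the inputs

    with torch.no_grad():
        torch.manual_seed(1)            # pin the random slot initialization...
        out_a = module(x)
        torch.manual_seed(1)            # ...so both calls start from identical slots
        out_b = module(x[:, perm])

    # Input-permutation invariance: same slots regardless of input order
    # (up to floating-point noise from the reordered reductions).
    print(torch.allclose(out_a, out_b, atol=1e-5))
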

