
Object-Centric Learning with Slot Attention

About

Learning object-centric representations of complex scenes is a promising step towards enabling efficient abstract reasoning from low-level perceptual features. Yet, most deep learning approaches learn distributed representations that do not capture the compositional properties of natural scenes. In this paper, we present the Slot Attention module, an architectural component that interfaces with perceptual representations such as the output of a convolutional neural network and produces a set of task-dependent abstract representations which we call slots. These slots are exchangeable and can bind to any object in the input by specializing through a competitive procedure over multiple rounds of attention. We empirically demonstrate that Slot Attention can extract object-centric representations that enable generalization to unseen compositions when trained on unsupervised object discovery and supervised property prediction tasks.
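The competitive binding described above can be sketched in a few lines. The sketch below is a simplified, illustrative NumPy version, not the paper's implementation: the learned linear projections (q, k, v), the GRU update, and the residual MLP are omitted, and slots are initialized from a plain random Gaussian rather than a learned distribution. What it does preserve is the key mechanism: the softmax is taken over the slot axis, so slots compete for each input feature, and each slot is then updated with a weighted mean of the inputs it won.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def slot_attention(inputs, num_slots=4, iters=3, seed=0):
    """Simplified Slot Attention sketch.

    inputs: (n, d) array of perceptual features (e.g. flattened CNN outputs).
    Returns slots: (num_slots, d).
    Assumptions: no learned q/k/v projections, no GRU/MLP update,
    random (not learned) Gaussian slot initialization.
    """
    rng = np.random.default_rng(seed)
    n, d = inputs.shape
    slots = rng.normal(size=(num_slots, d))
    scale = d ** -0.5
    for _ in range(iters):
        # Dot-product attention logits between slots and inputs.
        logits = scale * slots @ inputs.T            # (num_slots, n)
        # Softmax over the SLOT axis: slots compete for each input,
        # so each input's attention mass sums to 1 across slots.
        attn = softmax(logits, axis=0)
        # Normalize per slot, then aggregate inputs as a weighted mean.
        w = attn / (attn.sum(axis=1, keepdims=True) + 1e-8)
        slots = w @ inputs                           # paper applies GRU + MLP here
    return slots
```

Because slots share the same update rule and differ only in their (random) initialization, they are exchangeable: permuting the slots permutes the output, which is what lets any slot bind to any object.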

Francesco Locatello, Dirk Weissenborn, Thomas Unterthiner, Aravindh Mahendran, Georg Heigold, Jakob Uszkoreit, Alexey Dosovitskiy, Thomas Kipf · 2020

Related benchmarks

Task                                     Dataset                Metric    Result    Rank
Object-Centric Learning                  MOVi-C                 mBO^i     26.2      29
Object-Centric Representation Learning   VOC                    mBO^i     24.6      28
Object-Centric Representation Learning   COCO                   mBO^i     17.2      27
Unsupervised Object Segmentation         COCO                   mBO^i     17.2      26
Object-Centric Learning                  MOVi-E                 mBO^i     24.0      22
Unsupervised Object Segmentation         CLEVRTEX 1.0 (test)    FG-ARI    62.4      20
Unsupervised Object Segmentation         MOVi-C                 FG-ARI    49.54     18
Object Discovery                         MOVi-C                 mBO^i     26.2      18
Object-Centric Learning                  Pascal                 mBO^i     24.6      18
Unsupervised Object Segmentation         Pascal                 mBO^i     24.6      17

Showing 10 of 71 rows.
