
Slot Attention-based Feature Filtering for Few-Shot Learning

About

Irrelevant features can significantly degrade few-shot learning performance. Few-shot learning matches queries and support images based on meaningful similarities despite the limited data. However, in this process, non-relevant features such as background elements can easily lead to confusion and misclassification. To address this issue, we propose Slot Attention-based Feature Filtering for Few-Shot Learning (SAFF), which leverages slot attention mechanisms to discriminate and filter weak features, thereby improving few-shot classification performance. The key innovation of SAFF lies in its integration of slot attention with patch embeddings, unifying class-aware slots into a single attention mechanism to filter irrelevant features effectively. We introduce a similarity matrix, computed across support and query images, that quantifies the relevance of the filtered embeddings for classification. Through experiments, we demonstrate that slot attention outperforms other attention mechanisms, capturing discriminative features while reducing irrelevant information. We validate our approach through extensive experiments on the few-shot learning benchmarks CIFAR-FS, FC100, miniImageNet, and tieredImageNet, outperforming several state-of-the-art methods.
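To make the mechanism concrete, below is a minimal NumPy sketch of a generic slot-attention forward pass (in the style of Locatello et al., 2020) over patch embeddings, followed by a cosine-similarity matrix between filtered support and query representations. This is an illustrative reconstruction, not the authors' SAFF implementation: all function names, the identity key/value projections, and the cosine choice of similarity are our simplifying assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def slot_attention(inputs, num_slots=4, num_iters=3, seed=0):
    """Minimal slot-attention forward pass (illustrative, not SAFF itself).

    inputs: (n_patches, d) patch embeddings.
    Returns (num_slots, d) slot vectors and the final (num_slots, n_patches)
    attention map; low-attention patches are the ones being "filtered out".
    """
    rng = np.random.default_rng(seed)
    n, d = inputs.shape
    slots = rng.normal(size=(num_slots, d))
    k = v = inputs  # identity key/value projections, for simplicity
    for _ in range(num_iters):
        q = slots
        # Softmax over the SLOT axis: slots compete for each patch,
        # which is what lets irrelevant background patches be isolated.
        attn = softmax(q @ k.T / np.sqrt(d), axis=0)  # (num_slots, n)
        # Weighted mean of the patches assigned to each slot.
        w = attn / (attn.sum(axis=1, keepdims=True) + 1e-8)
        slots = w @ v
    return slots, attn

def support_query_similarity(support_slots, query_slots):
    """Cosine-similarity matrix between filtered support and query slots,
    analogous in spirit to the support/query similarity matrix in the paper."""
    s = support_slots / np.linalg.norm(support_slots, axis=1, keepdims=True)
    q = query_slots / np.linalg.norm(query_slots, axis=1, keepdims=True)
    return q @ s.T  # (num_query_slots, num_support_slots)
```

Because the softmax runs over slots rather than patches, each patch's attention mass is divided among the slots; in a few-shot pipeline, the slots (or the attention-weighted patches) would replace raw patch embeddings before the query/support comparison.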

Javier Rodenas, Eduardo Aguilar, Petia Radeva• 2025

Related benchmarks

Task                                   Dataset    Result                 Rank
5-way Few-shot Image Classification    CIFAR-FS   Mean Accuracy 90.26    30
5-way Few-shot Image Classification    FC100      Mean Accuracy 66.22    20
