
Taking Shortcuts for Categorical VQA Using Super Neurons

About

Sparse Attention Vectors (SAVs) have emerged as an excellent training-free alternative to supervised finetuning or low-rank adaptation to improve the performance of Vision Language Models (VLMs). At their heart, SAVs select a few accurate attention heads for a task of interest and use them as classifiers, rather than relying on the model's prediction. In a similar spirit, we find that directly probing the raw activations of the VLM, in the form of scalar values, is sufficient to yield accurate classifiers on diverse visually grounded downstream tasks. Shifting focus from attention vectors to scalar activations dramatically increases the search space for accurate parameters, allowing us to find more discriminative neurons immediately from the first generated token. We call such activations Super Neurons (SNs). In this probing setting, we discover that enough SNs appear in the shallower layers of the large language model to allow for extreme early exiting from the first layer of the model at the first generated token. Compared to the original network, SNs robustly improve the classification performance while achieving a speedup of up to 5.10x.
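
The abstract sketches the core mechanics: read off scalar activations at the first generated token from a shallow decoder layer, score each neuron for how well it separates the answer classes on a handful of labeled examples, and keep the top scorers as the classifier. Below is a minimal sketch of that idea using synthetic activations in place of real VLM hidden states; the `select_super_neurons` helper, the Fisher-style separation score, and the nearest-centroid decision rule are illustrative assumptions, since the abstract does not state the paper's exact selection criterion or decision rule.

```python
import numpy as np

# Illustrative sketch only: `acts` stands in for first-layer activations at
# the first generated token, which would normally be captured with a forward
# hook on the VLM's language decoder. The selection score and decision rule
# below are assumptions for demonstration, not the paper's exact recipe.
rng = np.random.default_rng(0)
n_support, n_query, d, n_classes = 40, 10, 4096, 2

acts = rng.normal(size=(n_support, d))            # support-set activations
labels = rng.integers(0, n_classes, size=n_support)
acts[labels == 1, :32] += 2.0                     # plant discriminative neurons
query = rng.normal(size=(n_query, d))             # queries drawn from class 1
query[:, :32] += 2.0

def select_super_neurons(acts, labels, k=16):
    """Score each scalar activation with a Fisher-style ratio
    (between-class variance / within-class variance) and keep the top k."""
    mu = acts.mean(axis=0)
    between = np.zeros(acts.shape[1])
    within = np.zeros(acts.shape[1])
    for c in np.unique(labels):
        xc = acts[labels == c]
        between += len(xc) * (xc.mean(axis=0) - mu) ** 2
        within += ((xc - xc.mean(axis=0)) ** 2).sum(axis=0)
    return np.argsort(between / (within + 1e-8))[-k:]

sn_idx = select_super_neurons(acts, labels)

# Classify each query by its nearest class centroid in the SN subspace.
classes = np.unique(labels)
centroids = np.stack([acts[labels == c][:, sn_idx].mean(axis=0) for c in classes])
dists = np.linalg.norm(query[:, sn_idx][:, None, :] - centroids[None], axis=-1)
print(classes[np.argmin(dists, axis=1)])          # expected: mostly 1s
```

In a real pipeline the activations would come from the first decoder layer at the first generated token, which is what makes the extreme early exit, and hence the reported speedup, possible.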

Pierre Musacchio, Jaeyi Jeong, Dahun Kim, Jaesik Park • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Visual Question Answering | A-OKVQA (val) | Accuracy | 85.2 | 88 |
| Visual Question Answering | VizWiz (val) | VQA Score | 81.2 | 51 |
| Visual Question Answering | CLEVR (val) | Overall Accuracy | 88.3 | 21 |
| Visual Question Answering | POPE (val) | Accuracy | 93.7 | 6 |
| Visual Question Answering | InstaOrder Depth (val) | Accuracy | 66.1 | 6 |
| Visual Question Answering | ScienceQA (val) | Accuracy | 82.9 | 6 |
| Visual Question Answering | InstaOrder Occ. (val) | Accuracy | 78.2 | 6 |
| Categorical Visual Question Answering | POPE | Accuracy | 96.1 | 4 |
| Categorical Visual Question Answering | InstaOrder Depth | Accuracy | 63.5 | 4 |
| Categorical Visual Question Answering | InstaOrder Occ. | Accuracy | 62.7 | 4 |
(Showing 10 of 14 benchmark rows.)
