
Mitigating Object Hallucination via Concentric Causal Attention

About

Recent Large Vision Language Models (LVLMs) present remarkable zero-shot conversational and reasoning capabilities given multimodal queries. Nevertheless, they suffer from object hallucination, a phenomenon where LVLMs are prone to generating textual responses not factually aligned with image inputs. Our pilot study reveals that object hallucination is closely tied to Rotary Position Encoding (RoPE), a widely adopted positional dependency modeling design in existing LVLMs. Due to the long-term decay in RoPE, LVLMs tend to hallucinate more when relevant visual cues are distant from instruction tokens in the multimodal input sequence. Additionally, we observe a similar effect when reversing the sequential order of visual tokens during multimodal alignment. Our tests indicate that long-term decay in RoPE poses challenges for LVLMs in capturing visual-instruction interactions across long distances. We propose Concentric Causal Attention (CCA), a simple yet effective positional alignment strategy that mitigates the impact of RoPE long-term decay in LVLMs by naturally reducing the relative distance between visual and instruction tokens. With CCA, visual tokens can better interact with instruction tokens, thereby enhancing the model's perception capability and alleviating object hallucination. Without bells and whistles, our positional alignment method surpasses existing hallucination mitigation strategies by large margins on multiple object hallucination benchmarks.
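The core idea above is a positional re-assignment for the 2D visual token grid. The sketch below is an illustrative (not the paper's exact) implementation, assuming one simple variant: tokens in the same concentric ring of an n×n grid share a position index, counted from the outer border inward. Compared to raster-scan ordering, where positions span 0 to n²−1, the maximum visual position shrinks to roughly n/2, so no visual token is positionally far from the instruction tokens that follow. The names `ring_index`, `concentric_positions`, and `raster_positions` are hypothetical helpers for illustration.

```python
def ring_index(i, j, n):
    # Ring number of cell (i, j) in an n x n grid, counted from the
    # outermost border (ring 0) toward the center.
    return min(i, j, n - 1 - i, n - 1 - j)

def concentric_positions(n):
    # Illustrative concentric assignment: every token in the same ring
    # shares one position id, so the largest visual position is ~n/2.
    return [[ring_index(i, j, n) for j in range(n)] for i in range(n)]

def raster_positions(n):
    # Conventional raster-scan assignment: positions run 0 .. n*n - 1,
    # so top-left visual tokens sit far from the instruction tokens.
    return [[i * n + j for j in range(n)] for i in range(n)]

if __name__ == "__main__":
    n = 24  # e.g. a 24 x 24 patch grid
    print(max(max(row) for row in concentric_positions(n)))  # 11
    print(max(max(row) for row in raster_positions(n)))      # 575
```

Under this assumption, the maximum relative distance between a visual token and the first instruction token drops from O(n²) to O(n), which is the distance reduction the abstract attributes to CCA.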

Yun Xing, Yiheng Li, Ivan Laptev, Shijian Lu • 2024

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Object Hallucination Evaluation | POPE | Accuracy | 86.5 | 1455 |
| Multimodal Evaluation | MME | -- | -- | 658 |
| Hallucination Evaluation | MMHal-Bench | MMHal Score | 1.92 | 216 |
| Hallucination Evaluation | AMBER | -- | -- | 172 |
| Object Hallucination Evaluation | CHAIR | CS Score | 43 | 108 |
| Object Hallucination Evaluation | MSCOCO POPE | Random Accuracy | 89.77 | 47 |
| Hallucination Evaluation | MME Hallucination | Existence Score | 190 | 39 |
| Hallucination Assessment | Object-HalBench | Mention Hallucination Rate | 23.8 | 39 |
| Multimodal Capability Evaluation | MM-Star | Average Score | 32.1 | 36 |
| Object Hallucination Assessment | MSCOCO (500 random samples) | Cs | 48.6 | 25 |

Showing 10 of 14 rows
