
Teaching Large Language Models to Regress Accurate Image Quality Scores using Score Distribution

About

With the rapid advancement of Multi-modal Large Language Models (MLLMs), MLLM-based Image Quality Assessment (IQA) methods have shown promising performance in linguistic quality description. However, current methods still fall short in accurately scoring image quality. In this work, we aim to leverage MLLMs to regress accurate quality scores. A key challenge is that the quality score is inherently continuous, typically modeled as a Gaussian distribution, whereas MLLMs generate discrete token outputs. This mismatch necessitates score discretization. Previous approaches discretize the mean score into a one-hot label, resulting in information loss and failing to capture inter-image relationships. We propose a distribution-based approach that discretizes the score distribution into a soft label. This method preserves the characteristics of the score distribution, achieving high accuracy and maintaining inter-image relationships. Moreover, to address dataset variation, where different IQA datasets exhibit different score distributions, we introduce a fidelity loss based on Thurstone's model. This loss captures intra-dataset relationships, facilitating co-training across multiple IQA datasets. With these designs, we develop the distribution-based Depicted image Quality Assessment model for Score regression (DeQA-Score). Experiments across multiple benchmarks show that DeQA-Score stably outperforms baselines in score regression. Moreover, DeQA-Score can predict score distributions that closely align with human annotations. Code and model weights have been released at https://depictqa.github.io/deqa-score/.
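The two mechanisms described above can be sketched in code. The first sketch discretizes a Gaussian score distribution N(mu, sigma^2) into a soft label over discrete score levels by integrating the density over a bin around each level; the second computes a fidelity loss between Thurstone-model comparison probabilities of two images. This is a minimal illustration of the general ideas, not the released implementation: the level set (1..5), bin edges at level midpoints, and function names are assumptions for illustration.

```python
import math


def normal_cdf(x, mu=0.0, sigma=1.0):
    """Gaussian CDF of N(mu, sigma^2) evaluated at x."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))


def soft_label(mu, sigma, levels=(1, 2, 3, 4, 5)):
    """Discretize N(mu, sigma^2) into a soft label over discrete levels.

    Each level's probability is the Gaussian mass in the bin bounded by
    the midpoints between adjacent levels (outer bins extend to +/-inf),
    renormalized to sum to 1. The levels (1..5) are an illustrative choice.
    """
    edges = [(levels[i] + levels[i + 1]) / 2.0 for i in range(len(levels) - 1)]
    bounds = [-math.inf] + edges + [math.inf]
    probs = [normal_cdf(bounds[i + 1], mu, sigma) - normal_cdf(bounds[i], mu, sigma)
             for i in range(len(levels))]
    total = sum(probs)
    return [p / total for p in probs]


def thurstone_prob(mu_i, sigma_i, mu_j, sigma_j):
    """Thurstone's model: probability that image i is rated better than j,
    assuming independent Gaussian quality scores for the two images."""
    z = (mu_i - mu_j) / math.sqrt(sigma_i ** 2 + sigma_j ** 2)
    return normal_cdf(z)


def fidelity_loss(p_true, p_pred):
    """Fidelity loss between two pairwise-preference probabilities;
    zero iff the predicted probability matches the ground truth."""
    return 1.0 - math.sqrt(p_true * p_pred) \
               - math.sqrt((1.0 - p_true) * (1.0 - p_pred))
```

For example, `soft_label(3.0, 0.5)` concentrates roughly 68% of the mass on level 3, and `fidelity_loss(p, p)` is 0 for any p, so the loss only penalizes mismatched pairwise relationships within a dataset.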

Zhiyuan You, Xin Cai, Jinjin Gu, Tianfan Xue, Chao Dong• 2025

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Quality Assessment | SPAQ | SRCC | 0.934 | 250 |
| Image Quality Assessment | CSIQ | SRCC | 0.857 | 150 |
| Image Quality Assessment | AGIQA-3K | SRCC | 0.745 | 131 |
| Image Quality Assessment | KADID | SRCC | 0.694 | 128 |
| Image Quality Assessment | KonIQ-10k | SRCC | 0.953 | 126 |
| Image Quality Assessment | PIPAL | SRCC | 0.690 | 123 |
| No-Reference Image Quality Assessment | CSIQ | SRCC | 0.847 | 121 |
| Image Quality Assessment | KonIQ | SRCC | 0.946 | 119 |
| Blind Image Quality Assessment | FLIVE | SRCC | 0.501 | 115 |
| No-Reference Image Quality Assessment | TID2013 | SRCC | 0.132 | 105 |

Showing 10 of 68 rows
