
VQA-RAD

Benchmarks

| Task Name | Dataset Name | SOTA Result | Trend |
| --- | --- | --- | --- |
| Medical Visual Question Answering | VQA-RAD | Accuracy: 80.4 | 198 |
| Medical Visual Question Answering | VQA-RAD (Closed) | ECE: 1.3 | 96 |
| Visual Question Answering | VQA-RAD (Open) | AUROC: 0.819 | 96 |
| Visual Question Answering | VQA-RAD Closed | AUROC: 70.2 | 96 |
| Visual Question Answering | VQA-RAD | Closed Accuracy: 86.8 | 64 |
| Visual Question Answering | VQA-RAD (test) | Open-ended Accuracy: 74.9 | 46 |
| Medical Visual Question Answering | VQA-RAD closed-end | Accuracy: 84.86 | 45 |
| Hallucination Detection | VQA-RAD (All) | AUC: 67.7 | 41 |
| Hallucination Detection | VQA-RAD Open-Ended | AUC: 72.9 | 41 |
| Medical Visual Question Answering | VQA-RAD (test) | Accuracy: 79.8 | 38 |
| Medical Visual Question Answering | VQA-RAD Open | Accuracy: 61.5 | 26 |
| Multimodal Medical Reasoning | VQA-RAD | Accuracy (%): 80.45 | 18 |
| Multi-modal Question Answering | VQA-RAD | Accuracy: 87.1 | 12 |
| Medical Visual Question Answering | VQA-RAD cross-domain | Accuracy: 0.789 | 10 |
| Medical Visual Question Answering | VQA-RAD (in-domain) | Accuracy: 83.3 | 10 |
| Medical Visual Question Answering | VQA-Rad 2018 | Accuracy: 87.05 | 7 |
| Medical Visual Question Answering | VQA-RAD | L-VASE: 94.4 | 6 |
| Reasoning | VQA-RAD | Correctness: 47.34 | 6 |
| Medical Visual Question Answering | VQA-RAD (held-out) | Accuracy: 63.8 | 6 |
| Biomedical Visual Question Answering | VQA-RAD (test) | Closed Accuracy: 87.1 | 4 |
| Medical Visual Question Answering | VQA-Rad 2019 | Accuracy: 81.17 | 4 |
| Medical Visual Question Answering | VQA-RAD | BLEU-1: 0.438 | 3 |
| Medical Visual Question Answering | VQA-Rad (All) | Accuracy: 80.37 | 3 |
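Several of the headline metrics above (Accuracy, Closed Accuracy, Open-ended Accuracy) are exact-match rates over the answer set, computed separately for closed-ended (typically yes/no) and open-ended questions. A minimal sketch of that split, assuming predictions and gold answers as parallel lists with a per-question closed/open flag (all function and argument names here are hypothetical, not from any official VQA-RAD evaluation script):

```python
def exact_match_accuracy(preds, golds):
    """Percentage of predictions exactly matching the gold answer
    (case-insensitive, whitespace-normalized)."""
    norm = lambda s: " ".join(s.lower().split())
    correct = sum(norm(p) == norm(g) for p, g in zip(preds, golds))
    return 100.0 * correct / len(preds)

def split_accuracies(preds, golds, is_closed):
    """Overall, closed-ended, and open-ended accuracy, mirroring the
    Accuracy / Closed Accuracy / Open-ended Accuracy columns reported
    on VQA-RAD leaderboards."""
    closed = [(p, g) for p, g, c in zip(preds, golds, is_closed) if c]
    opened = [(p, g) for p, g, c in zip(preds, golds, is_closed) if not c]
    return {
        "overall": exact_match_accuracy(preds, golds),
        "closed": exact_match_accuracy(*zip(*closed)) if closed else None,
        "open": exact_match_accuracy(*zip(*opened)) if opened else None,
    }
```

For example, with `preds=["yes", "no", "left lung"]`, `golds=["yes", "yes", "left lung"]`, and `is_closed=[True, True, False]`, this yields a closed accuracy of 50.0 and an open accuracy of 100.0.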