| Task Name | Dataset Name | Metric | SOTA Result | Trend |
|---|---|---|---|---|
| Medical Visual Question Answering | VQA-RAD | Accuracy | 80.4 | 198 |
| Medical Visual Question Answering | VQA-RAD (Closed) | ECE | 1.3 | 96 |
| Visual Question Answering | VQA-RAD (Open) | AUROC | 0.819 | 96 |
| Visual Question Answering | VQA-RAD Closed | AUROC | 70.2 | 96 |
| Visual Question Answering | VQA-RAD | Closed Accuracy | 86.8 | 64 |
| Visual Question Answering | VQA-RAD (test) | Open-ended Accuracy | 74.9 | 46 |
| Medical Visual Question Answering | VQA-RAD closed-end | Accuracy | 84.86 | 45 |
| Hallucination Detection | VQA-RAD (All) | AUC | 67.7 | 41 |
| Hallucination Detection | VQA-RAD Open-Ended | AUC | 72.9 | 41 |
| Medical Visual Question Answering | VQA-RAD (test) | Accuracy | 79.8 | 38 |
| Medical Visual Question Answering | VQA-RAD Open | Accuracy | 61.5 | 26 |
| Multimodal Medical Reasoning | VQA-RAD | Accuracy (%) | 80.45 | 18 |
| Multi-modal Question Answering | VQA-RAD | Accuracy | 87.1 | 12 |
| Medical Visual Question Answering | VQA-RAD cross-domain | Accuracy | 0.789 | 10 |
| Medical Visual Question Answering | VQA-RAD (in-domain) | Accuracy | 83.3 | 10 |
| Medical Visual Question Answering | VQA-Rad 2018 | Accuracy | 87.05 | 7 |
| Medical Visual Question Answering | VQA-RAD | L-VASE | 94.4 | 6 |
| Reasoning | VQA-RAD | Correctness | 47.34 | 6 |
| Medical Visual Question Answering | VQA-RAD (held-out) | Accuracy | 63.8 | 6 |
| Biomedical Visual Question Answering | VQA-RAD (test) | Closed Accuracy | 87.1 | 4 |
| Medical Visual Question Answering | VQA-Rad 2019 | Accuracy | 81.17 | 4 |
| Medical Visual Question Answering | VQA-RAD | BLEU-1 | 0.438 | 3 |
| Medical Visual Question Answering | VQA-Rad (All) | Accuracy | 80.37 | 3 |