| Task Name | Dataset Name | Metric | SOTA Result | Trend |
|---|---|---|---|---|
| Medical Visual Question Answering | VQA-RAD | Accuracy | 80.4 | 106 |
| Visual Question Answering | VQA-RAD | Closed Accuracy | 86.8 | 49 |
| Medical Visual Question Answering | VQA-RAD (closed-ended) | Accuracy | 84.86 | 45 |
| Visual Question Answering | VQA-RAD (test) | Open-Ended Accuracy | 74.9 | 33 |
| Medical Visual Question Answering | VQA-RAD (open-ended) | Accuracy | 61.5 | 26 |
| Hallucination Detection | VQA-RAD (all) | AUC | 63.99 | 21 |
| Hallucination Detection | VQA-RAD (open-ended) | AUC | 70 | 21 |
| Medical Visual Question Answering | VQA-RAD (test) | Accuracy | 70.7 | 13 |
| Multi-modal Question Answering | VQA-RAD | Accuracy | 87.1 | 12 |
| Medical Visual Question Answering | VQA-RAD (cross-domain) | Accuracy | 0.789 | 10 |
| Medical Visual Question Answering | VQA-RAD (in-domain) | Accuracy | 83.3 | 10 |
| Medical Visual Question Answering | VQA-RAD 2018 | Accuracy | 87.05 | 7 |
| Reasoning | VQA-RAD | Correctness | 47.34 | 6 |
| Medical Visual Question Answering | VQA-RAD (held-out) | Accuracy | 63.8 | 6 |
| Biomedical Visual Question Answering | VQA-RAD (test) | Closed Accuracy | 87.1 | 4 |
| Medical Visual Question Answering | VQA-RAD 2019 | Accuracy | 81.17 | 4 |
| Medical Visual Question Answering | VQA-RAD (all) | Accuracy | 80.37 | 3 |