| Benchmark | Model | Metric | Score |
| --- | --- | --- | --- |
| UD Treebank English 2.15 | | Accuracy | 91.9 |
| UD Treebank Dutch 2.15 | | Accuracy | 94.7 |
| PTB (test) | Ling et al. | Accuracy | 97.78 |
| Ritter11 T-POS (test) | ACE | Accuracy | 93.4 |
| ARK-Twitter (test) | ACE | Accuracy | 94.4 |
| Penn Treebank (PTB) Section 23 v2.2 (test) | jPTDP | POS Accuracy | 97.97 |
| Turkish (TR) (test) | Stanza | Accuracy | 94.2 |
| Universal Dependency Portuguese (test) | Stanza | Accuracy | 0.97 |
| Universal Dependency Lithuanian (test) | Stanza | Accuracy | 93.4 |
| Universal Dependency Finnish (test) | Stanza | Accuracy | 97 |
| Universal Dependency Basque (test) | Stanza | Accuracy | 96.2 |
| Universal Dependency Afrikaans (test) | Stanza | Accuracy | 97.6 |
| WSJ (dev) | MT-Tri | Accuracy | 97.37 |
| SANCL 1.0 (dev) | | Accuracy (Answers) | 90.3 |
| Hebrew UD Corpus v2 (test) | mT5 | mset Accuracy | 97.46 |
| SANCL (test) | | Accuracy (Answers) | 0.9121 |
| POS tagging (myv) Few-Text (test) | Gold MLM + Both (Label Distillation) | F1 Score | 74.3 |
| POS tagging mlt Few-Text (test) | Gold MLM + Pseudo MLM | F1 Score | 72.3 |
| POS tagging glv Few-Text (test) | Gold MLM + Both (Label Distillation) | F1 Score | 68.8 |
| Few-Text bam (test) | Gold MLM + Both (Label Distillation) | F1 Score | 69.4 |
| French Treebank (FTB) SPMRL shared task (test) | CamemBERT | POS Accuracy | 98.2 |
| French Treebank (FTB) SPMRL shared task (dev) | FlauBERT_BASE | POS Accuracy | 98.2 |
| UD Treebank German 2.15 | Bregman CRF (BCRF) | Accuracy | 94.4 |
| UD Treebank French 2.15 | Bregman CRF (BCRF) | Accuracy | 96.5 |
| IMST | TURNA | Precision | 94.66 |