Natural Language Inference on NLI-bias
[Leaderboard chart: Accuracy over time. Current state of the art: MABEL, 86.4 Accuracy (Jun 6, 2024). Metrics tracked: Accuracy, Bias.]
Evaluation Results

Method           Backbone      Date     Accuracy  Bias
MABEL            RoBERTa-base  2024.06  86.4      0.008
Vanilla-tuning   RoBERTa-base  2024.06  85.9      0.021
EAR              RoBERTa-base  2024.06  85.9      0.040
MABEL            BERT-base     2024.06  81.3      0.030
EAR              BERT-base     2024.06  79.6      0.013
Vanilla-tuning   BERT-base     2024.06  79.5      0.021
Debiased-tuning  RoBERTa-base  2024.06  77.4      0.015
Debiased-tuning  BERT-base     2024.06  75.1      0.020
ProSocialTuning  BERT-base     2024.06  74.7      0.012
ProSocialTuning  RoBERTa-base  2024.06  73.8      0.013