
MELD

Benchmarks

| Task Name | Dataset Name | Metric | SOTA Result | Trend |
| --- | --- | --- | --- | --- |
| Emotion Recognition in Conversation | MELD (test) | Weighted F1 | 70.44 | 143 |
| Emotion Recognition in Conversation | MELD | Weighted Avg F1 | 69.15 | 137 |
| Multimodal Emotion Recognition in Conversation | MELD standard (test) | WF1 | 67.6 | 53 |
| Multimodal Emotion Recognition in Conversation | MELD | Weighted Avg F1 Score | 69 | 36 |
| Emotion Detection | MELD (test) | Weighted-F1 | 0.699 | 32 |
| Emotion Recognition | MELD (test) | Weighted F1 | 70.12 | 28 |
| Emotion Recognition in Conversation | MELD Standard (test) | Weighted F1 | 69.15 | 19 |
| Speech Emotion Recognition | MELD | Accuracy | 63.5 | 19 |
| Emotion Recognition in Conversation | MELD 1.0 (test) | Weighted F1 | 65.61 | 17 |
| Emotion Recognition in Conversation | MELD | F1 Score | 67.29 | 16 |
| Cross-scenario Multimodal Emotion Recognition in Conversations | MELD -> IEMOCAP noise rate 40% (test) | Joy Accuracy | 41.46 | 15 |
| Sentiment Classification | MELD (test) | Accuracy | 68.5 | 15 |
| Speech Emotion Recognition | MELD In-Domain v1 (test) | Accuracy | 54.06 | 14 |
| Emotion Recognition | MELD | UACC | 64.34 | 12 |
| Conversational Emotion Recognition | MELD (test) | Macro F1 Score | 61.9 | 12 |
| Multimodal Emotion Recognition | MELD Natural Distribution | Accuracy | 57.43 | 9 |
| Multimodal Emotion Recognition | MELD IID Setting | Accuracy | 55.82 | 9 |
| Emotion Recognition | MELD (held-out) | F1 Score | 71.1 | 8 |
| Activation Task | Meld-S | AUAC | 63.5 | 8 |
| Multimodal Emotional Dialogue | MELD EmoOmniEval (test) | VS-RES | 1.42 | 7 |
| Sentiment Analysis (SEN) | MELD S | F1 (Binary Weighted) | 78.5 | 7 |
| Emotion Recognition (EMO) | MELD E | Mean Weighted Accuracy | 71.1 | 7 |
| Emotion Recognition in Conversation | MELD | F1 (Neutral) | 76.92 | 7 |
| Multi-modal Sentiment Analysis Classification (MSAC) | MELD | Neutral Accuracy | 0.8005 | 7 |
| Multimodal semantics discovery | MELD-DA (test) | NMI | 23.22 | 6 |

Note: scores are reported as submitted, so some entries use a 0–1 scale (e.g. 0.699, 0.8005) while most use percentages.
Showing 25 of 34 rows
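
Most rows above report a weighted F1, i.e. the mean of per-class F1 scores weighted by each class's support (its count in the gold labels). A minimal pure-Python sketch of that metric; the labels below are toy data for illustration, not actual MELD annotations (MELD's real emotion label set has 7 classes: anger, disgust, fear, joy, neutral, sadness, surprise).

```python
# Weighted F1: per-class F1 averaged with weights proportional to each
# class's count in the gold labels.
from collections import Counter

def weighted_f1(y_true, y_pred):
    support = Counter(y_true)          # gold-label count per class
    total = len(y_true)
    score = 0.0
    for cls, n in support.items():
        tp = sum(t == p == cls for t, p in zip(y_true, y_pred))
        pred_n = sum(p == cls for p in y_pred)
        prec = tp / pred_n if pred_n else 0.0
        rec = tp / n
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        score += n / total * f1        # weight F1 by class support
    return score

# Toy example (hypothetical labels, not from MELD):
y_true = ["neutral", "neutral", "joy", "anger", "joy", "neutral"]
y_pred = ["neutral", "joy", "joy", "anger", "joy", "neutral"]
print(round(weighted_f1(y_true, y_pred), 4))  # → 0.8333
```

This matches scikit-learn's `f1_score(y_true, y_pred, average="weighted")`; leaderboards then typically multiply by 100 to report a percentage.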