
Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling

About

Attention-based encoder-decoder neural network models have recently shown promising results in machine translation and speech recognition. In this work, we propose an attention-based neural network model for joint intent detection and slot filling, both of which are critical steps for many speech understanding and dialog systems. Unlike in machine translation and speech recognition, alignment is explicit in slot filling. We explore different strategies for incorporating this alignment information into the encoder-decoder framework. Learning from the attention mechanism in the encoder-decoder model, we further propose introducing attention to the alignment-based RNN models. Such attention provides additional information for intent classification and slot label prediction. Our independent task models achieve state-of-the-art intent detection error rate and slot filling F1 score on the benchmark ATIS task. Our joint training model further obtains 0.56% absolute (23.8% relative) error reduction on intent detection and 0.23% absolute gain on slot filling over the independent task models.

Bing Liu, Ian Lane · 2016
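The abstract's two ideas can be sketched in a few lines: for slot filling, alignment is explicit, so each output step combines the aligned encoder state with an attention context over all encoder states; for intent detection, an attention-weighted summary of the encoder serves as the utterance representation. A minimal NumPy sketch with random (untrained) weights, where all dimensions, weight matrices, and variable names are illustrative assumptions rather than the paper's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Illustrative dimensions (not from the paper)
T, H = 5, 8                  # sequence length, encoder hidden size
n_intents, n_slots = 3, 4

enc = rng.standard_normal((T, H))      # encoder hidden states h_1..h_T

# Slot filling: alignment is explicit, so step t uses h_t directly,
# augmented with an attention context over all encoder states.
W_att = rng.standard_normal((H, H))
scores = enc @ W_att @ enc.T           # (T, T) attention scores
alpha = softmax(scores, axis=-1)       # attention weights per output step
context = alpha @ enc                  # (T, H) context vectors

W_slot = rng.standard_normal((2 * H, n_slots))
slot_logits = np.concatenate([enc, context], axis=-1) @ W_slot
slot_pred = slot_logits.argmax(axis=-1)  # one slot label per input token

# Intent detection: a single attention-weighted utterance summary.
v = rng.standard_normal(H)
beta = softmax(enc @ v)                # (T,) weights over time steps
sentence_vec = beta @ enc              # (H,) utterance representation
W_int = rng.standard_normal((H, n_intents))
intent_pred = (sentence_vec @ W_int).argmax()
```

Joint training would share the encoder between the two heads, which is where the abstract's reported gains over independent task models come from.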

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Joint Multiple Intent Detection and Slot Filling | MixSNIPS (test) | Slot F1 | 89.4 | 57 |
| Slot Filling | ATIS (test) | F1 Score | 95.98 | 55 |
| Joint Multiple Intent Detection and Slot Filling | MixATIS (test) | F1 Score (Slot) | 86.4 | 42 |
| Slot Filling and Intent Detection | MixSNIPS | Overall Accuracy | 59.5 | 31 |
| Natural Language Understanding | Snips (test) | Intent Acc | 96.7 | 27 |
| Intent Detection | ATIS | -- | -- | 27 |
| Slot Filling | Snips (test) | F1 Score | 0.878 | 25 |
| Hate speech classification and explainability | HateXplain (test) | IOU F1 | 0.167 | 22 |
| Slot Filling | M2M | Micro F1 | 91.72 | 18 |
| Intent Detection | M2M | Accuracy | 92.5 | 18 |

Showing 10 of 23 rows.
