
Joint Multiple Intent Detection and Slot Filling via Self-distillation

About

Intent detection and slot filling are two main tasks in natural language understanding (NLU) for identifying users' needs from their utterances. These two tasks are highly related and often trained jointly. However, most previous works assume that each utterance corresponds to only one intent, ignoring the fact that a user utterance in many cases includes multiple intents. In this paper, we propose a novel Self-Distillation Joint NLU model (SDJN) for multi-intent NLU. First, we formulate multiple intent detection as a weakly supervised problem and approach it with multiple instance learning (MIL). Then, we design an auxiliary loop via self-distillation with three sequentially arranged decoders: an Initial Slot Decoder, a MIL Intent Decoder, and a Final Slot Decoder. The output of each decoder serves as auxiliary information for the next decoder. With the auxiliary knowledge provided by the MIL Intent Decoder, we set the Final Slot Decoder as the teacher model that imparts knowledge back to the Initial Slot Decoder to complete the loop. The auxiliary loop enables intents and slots to guide each other in depth and further boosts overall NLU performance. Experimental results on two public multi-intent datasets indicate that our model achieves strong performance compared to prior models.
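The two core mechanisms in the abstract can be illustrated with a minimal NumPy sketch: a MIL-style bag aggregator that turns token-level intent evidence into utterance-level multi-label scores, and a KL-divergence distillation loss that pushes the initial (student) slot decoder toward the final (teacher) slot decoder. Max-pooling as the MIL aggregator and plain KL as the distillation objective are illustrative assumptions here; the paper's exact formulation may differ.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def mil_intent_aggregate(token_logits):
    """MIL bag aggregation (illustrative): token-level intent logits of
    shape (num_tokens, num_intents) are mapped to per-token probabilities,
    then max-pooled so the strongest token decides each utterance-level
    intent score."""
    token_probs = sigmoid(token_logits)   # per-token intent evidence
    return token_probs.max(axis=0)        # bag score = max over instances

def self_distillation_loss(student_logits, teacher_logits):
    """KL divergence KL(teacher || student) over slot-label distributions,
    sketching how the Final Slot Decoder (teacher) could impart knowledge
    back to the Initial Slot Decoder (student)."""
    p_t = softmax(teacher_logits)
    p_s = softmax(student_logits)
    return float(np.sum(p_t * (np.log(p_t) - np.log(p_s))))
```

Under this sketch, an utterance is labeled with every intent whose bag score exceeds a threshold (e.g. 0.5), which is what makes the formulation weakly supervised: only utterance-level intent labels are needed, never token-level ones.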

Lisong Chen, Peilin Zhou, Yuexian Zou · 2021

Related benchmarks

Task                                              | Dataset         | Result                 | Rank
Joint Multiple Intent Detection and Slot Filling  | MixSNIPS (test) | Slot F1: 95.4          | 57
Joint Multiple Intent Detection and Slot Filling  | MixATIS (test)  | Slot F1: 88.2          | 42
Multi-intent Natural Language Understanding       | MixATIS (test)  | Overall Accuracy: 46.3 | 16
