
Multi-lingual Intent Detection and Slot Filling in a Joint BERT-based Model

About

Intent Detection and Slot Filling are two pillar tasks in Spoken Natural Language Understanding. Common approaches adopt joint Deep Learning architectures in attention-based recurrent frameworks. In this work, we aim at exploiting the success of "recurrence-less" models for these tasks. We introduce Bert-Joint, i.e., a multi-lingual joint text classification and sequence labeling framework. The experimental evaluation over two well-known English benchmarks demonstrates the strong performances that can be obtained with this model, even when little annotated data is available. Moreover, we annotated a new dataset for the Italian language, and we observed similar performances without the need to change the model.
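The joint setup described above can be sketched as two heads sharing one BERT-style encoder: a sentence-level classifier for the intent (read off the [CLS] position) and a token-level classifier for the slots. The snippet below is a minimal PyTorch sketch under assumed dimensions (hidden size 768, hypothetical intent and slot label counts); the encoder itself is stubbed out with random hidden states, since the paper's actual model is multi-lingual BERT.

```python
import torch
import torch.nn as nn

class JointHead(nn.Module):
    """Sketch of a joint intent + slot head on top of a BERT-style
    encoder (hypothetical label counts; encoder stubbed out below)."""

    def __init__(self, hidden_size=768, num_intents=7, num_slots=39):
        super().__init__()
        # Sentence-level head: applied to the [CLS] representation.
        self.intent_classifier = nn.Linear(hidden_size, num_intents)
        # Token-level head: applied to every token representation.
        self.slot_classifier = nn.Linear(hidden_size, num_slots)

    def forward(self, hidden_states):
        # hidden_states: (batch, seq_len, hidden_size) from the encoder
        intent_logits = self.intent_classifier(hidden_states[:, 0])  # [CLS]
        slot_logits = self.slot_classifier(hidden_states)            # per token
        return intent_logits, slot_logits

# Stand-in for BERT output: batch of 2 sentences, 16 tokens each.
hidden = torch.randn(2, 16, 768)
head = JointHead()
intent_logits, slot_logits = head(hidden)
print(intent_logits.shape)  # (2, 7)
print(slot_logits.shape)    # (2, 16, 39)
```

In training, the two cross-entropy losses (one over intents, one summed over token slot labels) are typically added together, so a single backward pass updates both heads and the shared encoder.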

Giuseppe Castellucci, Valentina Bellomaria, Andrea Favalli, Raniero Romagnoli • 2019

Related benchmarks

Task | Dataset | Result | Rank
Intent Detection and Slot Filling | SNIPS | Intent Accuracy: 99 | 4
