
A Simple but Effective BERT Model for Dialog State Tracking on Resource-Limited Systems

About

In a task-oriented dialog system, the goal of dialog state tracking (DST) is to monitor the state of the conversation from the dialog history. Recently, many deep learning based methods have been proposed for the task. Despite their impressive performance, current neural architectures for DST are typically heavily engineered and conceptually complex, making them difficult to implement, debug, and maintain in a production setting. In this work, we propose a simple but effective DST model based on BERT. In addition to its simplicity, our approach has a number of other advantages: (a) the number of parameters does not grow with the ontology size, and (b) the model can operate in situations where the domain ontology may change dynamically. Experimental results demonstrate that our BERT-based model outperforms previous methods by a large margin, achieving new state-of-the-art results on the standard WoZ 2.0 dataset. Finally, to make the model small and fast enough for resource-restricted systems, we apply knowledge distillation to compress it. The final compressed model achieves results comparable to the original model while being 8x smaller and 7x faster.
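The knowledge distillation step mentioned above trains a small student model to match the softened output distribution of the large BERT teacher. A minimal sketch of that objective in plain Python follows; the temperature value and function names are illustrative assumptions, not details from the paper:

```python
import math

def softmax(logits, temperature=1.0):
    # Soften logits with a temperature T; higher T spreads probability mass
    # across classes, exposing the teacher's "dark knowledge".
    exps = [math.exp(x / temperature) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Cross-entropy between the teacher's softened distribution and the
    # student's softened distribution. Minimized when the student reproduces
    # the teacher's outputs. (temperature=2.0 is an assumed example value.)
    teacher_probs = softmax(teacher_logits, temperature)
    student_probs = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(teacher_probs, student_probs))
```

In practice this soft-target loss is usually combined with the standard cross-entropy against the ground-truth labels, weighted by a mixing coefficient.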

Tuan Manh Lai, Quan Hung Tran, Trung Bui, Daisuke Kihara • 2019

Related benchmarks

Task: Dialogue State Tracking
Dataset: WOZ 2.0 (test)
Result: Joint Goal Accuracy 90.5
Rank: 65
