
Pre-Trained Language-Meaning Models for Multilingual Parsing and Generation

About

Pre-trained language models (PLMs) have achieved great success in NLP and have recently been used for tasks in computational semantics. However, these tasks do not fully benefit from PLMs since meaning representations are not explicitly included in the pre-training stage. We introduce multilingual pre-trained language-meaning models based on Discourse Representation Structures (DRSs), which include meaning representations alongside natural language texts in the same model, and design a new strategy to reduce the gap between the pre-training and fine-tuning objectives. Since DRSs are language-neutral, cross-lingual transfer learning is adopted to further improve the performance on non-English tasks. Automatic evaluation results show that our approach achieves the best performance on both the multilingual DRS parsing and DRS-to-text generation tasks. Correlation analysis between automatic metrics and human judgements on the generation task further validates the effectiveness of our model. Human inspection reveals that out-of-vocabulary tokens are the main cause of erroneous results.
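Concretely, both tasks can be framed as sequence-to-sequence problems over linearized DRSs: text in, DRS out for parsing, and the reverse for generation. The sketch below shows the parsing direction with a generic multilingual seq2seq PLM from Hugging Face transformers; the checkpoint name, language codes, and decoding settings are illustrative assumptions, not the authors' released setup, and a sensible DRS would only be produced after fine-tuning on text-DRS pairs such as those in the Parallel Meaning Bank.

```python
# Minimal sketch: DRS parsing as seq2seq generation with a multilingual PLM.
# Assumptions: a generic mBART-50 checkpoint stands in for the paper's
# language-meaning model; the real setup pre-trains on DRSs as well as text.

from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_name = "facebook/mbart-large-50"  # placeholder, not the paper's checkpoint
tokenizer = MBart50TokenizerFast.from_pretrained(
    model_name, src_lang="en_XX", tgt_lang="en_XX"
)
model = MBartForConditionalGeneration.from_pretrained(model_name)

text = "Tom is smart."

# Encode the sentence and decode a linearized DRS with beam search.
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=128,
    num_beams=4,
    forced_bos_token_id=tokenizer.lang_code_to_id["en_XX"],
)
drs = tokenizer.batch_decode(outputs, skip_special_tokens=True)[0]
print(drs)  # yields a linearized DRS only after fine-tuning on text-DRS pairs
```

Swapping the source and target sides of the training pairs gives the DRS-to-text generation direction with the same underlying model.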

Chunliu Wang, Huiyuan Lai, Malvina Nissim, Johan Bos • 2023

Related benchmarks

Task                  Dataset               Result           Rank
Text-to-DRS parsing   PMB EN 4.0.0 (test)   F1 Score: 94.7   7
Text-to-DRS parsing   PMB DE 4.0.0 (test)   F1 Score: 92.0   7
Text-to-DRS parsing   PMB IT 4.0.0 (test)   F1 Score: 93.1   7
Text-to-DRS parsing   PMB NL 4.0.0 (test)   F1 Score: 92.6   7
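The F1 scores above are matching scores between predicted and gold meaning representations (Smatch- or Counter-style matching, depending on the DRS format). The sketch below shows a simplified version of such a score that treats each DRS as a set of clause strings; it omits the variable-renaming search a real evaluator performs, and the example clauses are hypothetical, not taken from the PMB.

```python
# Simplified clause-level F1 between a gold and a predicted DRS, each given
# as a list of clause strings. Real evaluators (e.g., Counter) additionally
# search over variable mappings, much like Smatch does for AMR.

def clause_f1(gold_clauses, pred_clauses):
    gold, pred = set(gold_clauses), set(pred_clauses)
    matched = len(gold & pred)
    precision = matched / len(pred) if pred else 0.0
    recall = matched / len(gold) if gold else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical clauses for illustration only:
gold = ["b1 REF x1", "b1 Name x1 tom", "b1 smart v1", "b1 Experiencer v1 x1"]
pred = ["b1 REF x1", "b1 Name x1 tom", "b1 smart v1"]
print(round(clause_f1(gold, pred), 3))  # 0.857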

Other info

Code
