Multi-Task Neural Models for Translating Between Styles Within and Across Languages
About
Generating natural language requires conveying content in an appropriate style. We explore two related tasks on generating text of varying formality: monolingual formality transfer and formality-sensitive machine translation. We propose to solve these tasks jointly using multi-task learning, and show that our models achieve state-of-the-art performance for formality transfer and are able to perform formality-sensitive translation without being explicitly trained on style-annotated translation examples.
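The multi-task setup described above can be realized by training a single sequence-to-sequence model on mixed examples whose source side is prefixed with control tokens naming the task and the desired formality. A minimal sketch of such data preparation is below; the tag names, helper function, and example sentences are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch: build multi-task training pairs by prefixing each source
# sentence with control tokens for the task (transfer vs. translate) and the
# target formality. One shared seq2seq model then learns both tasks jointly.
# Tag names and data are illustrative, not from the paper.

def tag_example(src, tgt, task, target_style):
    """Prefix the source with control tokens naming the task and target style."""
    return (f"<{task}> <to_{target_style}> {src}", tgt)

# Monolingual formality transfer (informal English -> formal English):
pairs = [
    tag_example("gotta go, cya!", "I have to leave now. Goodbye.",
                task="transfer", target_style="formal"),
    # Formality-sensitive translation (French -> informal English):
    tag_example("Je dois partir.", "I gotta go.",
                task="translate", target_style="informal"),
]

for src, tgt in pairs:
    print(src, "=>", tgt)
```

Because both tasks share one encoder-decoder, style knowledge learned from monolingual transfer data can carry over to translation without style-annotated parallel examples.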
Xing Niu, Sudha Rao, Marine Carpuat • 2018
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Formality Style Transfer | GYAFC Entertainment & Music 1.0 (test) | BLEURT | 0.023 | 15 |
| Formality Style Transfer | GYAFC Family & Relationships 1.0 (test) | BLEU | 0.568 | 15 |
| Formality Style Transfer | GYAFC Entertainment & Music (test) | BLEU | 72.01 | 10 |
| Formality Style Transfer | GYAFC Family & Relationships (test) | BLEU | 75.35 | 10 |