
Pushing Paraphrase Away from Original Sentence: A Multi-Round Paraphrase Generation Approach

About

In recent years, neural paraphrase generation based on Seq2Seq models has achieved superior performance; however, the generated paraphrases still lack diversity. In this paper, we focus on improving the diversity between the generated paraphrase and the original sentence, i.e., making the generated paraphrase as different from the original sentence as possible. We propose BTmPG (Back-Translation guided multi-round Paraphrase Generation), which leverages multi-round paraphrase generation to improve diversity and employs back-translation to preserve semantic information. We evaluate BTmPG on two benchmark datasets. Both automatic and human evaluation show that BTmPG improves the diversity of paraphrases while preserving the semantics of the original sentence.
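The multi-round idea in the abstract can be sketched as a simple loop: each round re-paraphrases the previous round's output, so the surface form can drift further from the original, while a back-translation step serves as a semantic anchor. The word-substitution "models" below are toy, illustrative stand-ins for the paper's neural paraphrase and translation models, not its actual implementation.

```python
# Toy stand-in for a paraphrase model: swap known words for synonyms.
SYNONYMS = {"quickly": "fast", "fast": "rapidly",
            "learn": "study", "study": "master"}

def toy_paraphraser(sentence):
    # Illustrative substitute for BTmPG's neural paraphrase generator.
    return " ".join(SYNONYMS.get(w, w) for w in sentence.split())

def toy_back_translator(sentence):
    # Illustrative substitute for round-trip translation
    # (source -> pivot language -> source), which BTmPG uses
    # to preserve the original semantics. Identity here.
    return sentence

def multi_round_paraphrase(source, rounds=3):
    """Each round paraphrases the previous round's output, pushing the
    wording further from the source; the back-translation of each
    candidate is kept as a semantic anchor."""
    candidate, outputs = source, []
    for _ in range(rounds):
        candidate = toy_paraphraser(candidate)    # push away from source
        anchor = toy_back_translator(candidate)   # semantic anchor
        outputs.append((candidate, anchor))
    return outputs

rounds = multi_round_paraphrase("learn python quickly", rounds=2)
# Round 1: "study python fast"; Round 2: "master python rapidly"
```

With each round the output shares fewer surface tokens with the original sentence, which is exactly the diversity effect the paper targets; the back-translation anchor is what keeps the drift from destroying meaning.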

Zhe Lin, Xiaojun Wan · 2021

Related benchmarks

Task                   | Dataset        | Result          | Rank
Paraphrase Generation  | QQP (test)     | --              | 22
Paraphrase Generation  | MSCOCO (test)  | Self-BLEU 13.04 | 14
Paraphrase Generation  | Paralex (test) | BLEU 28.4       | 11
Paraphrase Generation  | Paralex        | iBLEU 15.5      | 4
Paraphrase Generation  | QQP            | iBLEU 9.13      | 4
Paraphrase Generation  | MSCOCO         | iBLEU 13.2      | 4
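The iBLEU scores in the table balance fidelity against novelty: iBLEU (Sun & Zhou, 2012) rewards overlap with the reference paraphrase while penalizing overlap with the source sentence, iBLEU = α·BLEU(candidate, reference) − (1−α)·BLEU(candidate, source). The sketch below uses a deliberately simplified bigram sentence BLEU to stay self-contained; the leaderboard numbers come from standard BLEU implementations, and α = 0.8 is one common setting, not necessarily the one used here.

```python
import math
from collections import Counter

def ngram_counts(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=2):
    # Simplified sentence-level BLEU: geometric mean of modified n-gram
    # precisions plus a brevity penalty. Bigram order keeps scores on
    # short sentences non-zero; real evaluations use 4-grams + smoothing.
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        c_counts, r_counts = ngram_counts(cand, n), ngram_counts(ref, n)
        overlap = sum(min(c, r_counts[g]) for g, c in c_counts.items())
        total = max(sum(c_counts.values()), 1)
        precisions.append(max(overlap, 1e-9) / total)  # floor avoids log(0)
    bp = math.exp(min(0.0, 1 - len(ref) / max(len(cand), 1)))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

def ibleu(candidate, reference, source, alpha=0.8):
    # iBLEU: high similarity to the reference is rewarded, copying the
    # source is penalized -- so verbatim copies score poorly even when
    # the source and reference share many words.
    return alpha * bleu(candidate, reference) - (1 - alpha) * bleu(candidate, source)
```

For example, returning the source sentence unchanged gets a *negative* penalty term of (1−α)·1.0, so a genuinely reworded candidate that matches the reference outscores a copy, which is why iBLEU is a better diversity-aware metric for this task than BLEU alone.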
