Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale

About

A syntactic language model (SLM) incrementally generates a sentence together with its syntactic tree in a left-to-right manner. We present Generative Pretrained Structured Transformers (GPST), an unsupervised SLM at scale capable of being pre-trained from scratch on raw texts with high parallelism. GPST circumvents the limitations of previous SLMs, such as reliance on gold trees and sequential training. It consists of two components: a standard SLM supervised by a uni-directional language modeling loss, and an additional composition model that induces syntactic parse trees and computes constituent representations, supervised by a bi-directional language modeling loss. We propose a representation surrogate to enable joint parallel training of the two models in a hard-EM fashion. We pre-train GPST on OpenWebText, a corpus of 9 billion tokens, and demonstrate the superiority of GPST over GPT-2 of comparable size on numerous tasks covering both language understanding and language generation. GPST also significantly outperforms existing unsupervised SLMs on left-to-right grammar induction, while training substantially faster.
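To make the two-component objective above concrete, below is a minimal PyTorch sketch, illustrative only: all names (ToyGPST, joint_loss), sizes, and the masking scheme are hypothetical, and the paper's tree-inducing composition model and representation surrogate are stood in for by a plain bidirectional encoder with a masked reconstruction loss. The point it shows is the joint training of a uni-directional LM loss and a bi-directional loss in one backward pass.

```python
# Hypothetical sketch of a joint uni-/bi-directional objective; the real
# GPST composition model (tree induction, constituent representations,
# hard-EM with a representation surrogate) is elided here.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyGPST(nn.Module):
    """Toy two-branch model: a causal SLM branch plus a bidirectional
    stand-in for the tree-inducing composition model."""
    def __init__(self, vocab_size=1000, d_model=64, nhead=4, layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.decoder = nn.TransformerEncoder(   # uni-directional branch
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers=layers)
        self.encoder = nn.TransformerEncoder(   # bidirectional branch
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers=layers)
        self.lm_head = nn.Linear(d_model, vocab_size)
        self.mlm_head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens, masked_tokens):
        T = tokens.size(1)
        # Causal mask: True entries are blocked, so position t sees only <= t.
        causal = torch.triu(
            torch.ones(T, T, dtype=torch.bool, device=tokens.device), 1)
        uni = self.decoder(self.embed(tokens), mask=causal)
        bi = self.encoder(self.embed(masked_tokens))  # full attention
        return self.lm_head(uni), self.mlm_head(bi)

def joint_loss(model, tokens, mask_id=0, mask_prob=0.15):
    # Randomly mask inputs to the bidirectional branch so its objective
    # is non-trivial (a crude proxy for constituent-level supervision).
    m = torch.rand(tokens.shape, device=tokens.device) < mask_prob
    m[:, 0] = True  # guarantee at least one masked position per sequence
    masked = tokens.masked_fill(m, mask_id)
    logits_uni, logits_bi = model(tokens, masked)
    vocab = logits_uni.size(-1)
    # Uni-directional LM loss: predict token t+1 from its prefix.
    lm_loss = F.cross_entropy(
        logits_uni[:, :-1].reshape(-1, vocab), tokens[:, 1:].reshape(-1))
    # Bi-directional loss: recover masked tokens from both-side context.
    bi_loss = F.cross_entropy(logits_bi[m], tokens[m])
    return lm_loss + bi_loss

model = ToyGPST()
batch = torch.randint(1, 1000, (2, 16))  # toy batch of token ids
loss = joint_loss(model, batch)
loss.backward()  # both branches are updated jointly
```

In the paper the two branches additionally share structure through the induced trees and the representation surrogate, which is what makes parallel hard-EM training possible; this sketch only mirrors the loss combination.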

Xiang Hu, Pengyu Ji, Qingyang Zhu, Wei Wu, Kewei Tu • 2024

Related benchmarks

Task                           | Dataset               | Metric   | Result | Rank
-------------------------------|-----------------------|----------|--------|-----
Natural Language Understanding | GLUE (val)            | SST-2    | 91.97  | 170
Abstractive Text Summarization | CNN/Daily Mail (test) | ROUGE-L  | 26     | 169
Unsupervised Parsing           | PTB (test)            | --       | --     | 75
Abstractive Summarization      | Gigaword (test)       | ROUGE-1  | 33.19  | 58
Abstractive Summarization      | XSum (test)           | ROUGE-L  | 25.58  | 44
Grammar Induction              | PTB English (test)    | F1 Score | 57.46  | 29
Syntactic Generalization       | SG                    | --       | --     | 24

Other info

Code
