
Unsupervised Neural Word Segmentation for Chinese via Segmental Language Modeling

About

Previous traditional approaches to unsupervised Chinese word segmentation (CWS) can be roughly classified into discriminative and generative models. The former use carefully designed goodness measures to score candidate segmentations, while the latter focus on finding the segmentation with the highest generative probability. However, while there exists a trivial way to extend discriminative models to neural versions by using neural language models, extending generative models is non-trivial. In this paper, we propose segmental language models (SLMs) for CWS. Our approach explicitly models the segmental nature of Chinese while preserving several properties of language models. In SLMs, a context encoder encodes the previous context and a segment decoder generates each segment incrementally. As far as we know, we are the first to propose a neural model for unsupervised CWS, and we achieve performance competitive with state-of-the-art statistical models on four different datasets from the SIGHAN 2005 bakeoff.
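To make the idea concrete, here is a minimal sketch of the dynamic program that underlies segmental models of this kind: given a scoring function for "probability of a segment given the preceding context", one can marginalize over all segmentations of a sentence, or recover the single best segmentation with Viterbi decoding. The `seg_logprob` function below is a stand-in assumption; in the paper's actual SLM it would be computed by the neural segment decoder conditioned on the context encoder, not by a toy lexicon as shown here.

```python
import math

def logsumexp(vals):
    """Numerically stable log(sum(exp(v) for v in vals))."""
    m = max(vals)
    return m + math.log(sum(math.exp(v - m) for v in vals))

def sequence_logprob(chars, seg_logprob, max_len=4):
    """Log-probability of the character sequence, marginalized over
    all segmentations with segments of length <= max_len.
    alpha[t] = log P(chars[:t])."""
    n = len(chars)
    alpha = [float("-inf")] * (n + 1)
    alpha[0] = 0.0
    for t in range(1, n + 1):
        cands = [alpha[j] + seg_logprob(chars[:j], chars[j:t])
                 for j in range(max(0, t - max_len), t)]
        alpha[t] = logsumexp(cands)
    return alpha[n]

def viterbi_segment(chars, seg_logprob, max_len=4):
    """Highest-scoring single segmentation (Viterbi decoding)."""
    n = len(chars)
    best = [float("-inf")] * (n + 1)
    back = [0] * (n + 1)
    best[0] = 0.0
    for t in range(1, n + 1):
        for j in range(max(0, t - max_len), t):
            score = best[j] + seg_logprob(chars[:j], chars[j:t])
            if score > best[t]:
                best[t], back[t] = score, j
    segs, t = [], n
    while t > 0:
        segs.append(chars[back[t]:t])
        t = back[t]
    return segs[::-1]

# Toy stand-in scorer: a tiny lexicon, NOT the paper's neural decoder.
def toy_seg_logprob(context, segment):
    lexicon = {"ab": -1.0}
    return lexicon.get(segment, -3.0 * len(segment))

print(viterbi_segment("ab", toy_seg_logprob))  # the lexicon word wins: ['ab']
```

The marginal computed by `sequence_logprob` is what an unsupervised model would maximize during training, while `viterbi_segment` is what produces the final word boundaries at test time.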

Zhiqing Sun, Zhi-Hong Deng • 2018

Related benchmarks

Task                        Dataset               Metric  Result  Rank
Chinese Word Segmentation   PKU (test)            F1      80.2    32
Word Segmentation           MSR Chinese (test)    F1      79.4    17
Word Segmentation           CITYU Chinese (test)  F1      80.5    16
Word Segmentation           AS Chinese (test)     F1      80.3    13
