
Neural Induction of Finite-State Transducers

About

Finite-State Transducers (FSTs) are effective models for string-to-string rewriting tasks, often providing the efficiency necessary for high-performance applications, but constructing transducers by hand is difficult. In this work, we propose a novel method for automatically constructing unweighted FSTs that follow the hidden-state geometry learned by a recurrent neural network. We evaluate our method on real-world datasets for morphological inflection, grapheme-to-phoneme prediction, and historical normalization, showing that the constructed FSTs are highly accurate and robust on many datasets, substantially outperforming classical transducer learning algorithms by up to 87% in accuracy on held-out test sets.
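The page does not detail the construction procedure, but the core idea the abstract describes (discretizing an RNN's hidden-state space into FST states) can be illustrated with a toy sketch. Everything below is an assumption for illustration, not the paper's method: an untrained Elman RNN stands in for the trained model, a naive k-means discretizes the hidden states, and transitions are read off the observed (cluster, input char) → (cluster, output char) steps of length-aligned training pairs. Names such as `induce_fst` and `translate` are hypothetical.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Minimal k-means; returns cluster assignments and centroids."""
    rng = np.random.default_rng(seed)
    cents = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        a = np.argmin(((X[:, None] - cents[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (a == j).any():
                cents[j] = X[a == j].mean(0)
    return a, cents

def induce_fst(pairs, dim=8, k=3, seed=0):
    """Toy FST induction from RNN hidden states (illustrative assumption).

    pairs: length-aligned (input, output) string pairs.
    Returns (transitions, start_state), where transitions maps
    (state, input_char) -> (next_state, output_char).
    """
    rng = np.random.default_rng(seed)
    vocab = sorted({c for x, _ in pairs for c in x})
    idx = {c: i for i, c in enumerate(vocab)}
    # Untrained Elman RNN as a stand-in for a trained encoder.
    W = rng.normal(scale=0.5, size=(dim, dim))
    U = rng.normal(scale=0.5, size=(dim, len(vocab)))
    paths = []  # per pair: hidden states h0..hT (h0 = zero vector)
    for x, _ in pairs:
        h, hs = np.zeros(dim), [np.zeros(dim)]
        for c in x:
            v = np.zeros(len(vocab)); v[idx[c]] = 1.0
            h = np.tanh(W @ h + U @ v)
            hs.append(h)
        paths.append(hs)
    # Discretize all hidden states jointly; clusters become FST states.
    assign, _ = kmeans(np.vstack([h for hs in paths for h in hs]), k, seed=seed)
    it = iter(assign)
    trans, start = {}, None
    for (x, y), hs in zip(pairs, paths):
        clusters = [next(it) for _ in hs]
        if start is None:
            start = clusters[0]
        for t, (cin, cout) in enumerate(zip(x, y)):
            trans[(clusters[t], cin)] = (clusters[t + 1], cout)
    return trans, start

def translate(x, trans, start):
    """Apply the induced transducer; naive fallback on missing arcs."""
    s, out = start, []
    for c in x:
        if (s, c) not in trans:
            # Fallback: borrow any state that has an outgoing arc on c.
            cands = [key for key in trans if key[1] == c]
            if not cands:
                return None
            s = cands[0][0]
        s, o = trans[(s, c)]
        out.append(o)
    return "".join(out)
```

For example, training on a toy character-swap task (`"ab"`→`"ba"`) and then calling `translate` replays the learned arcs over the clustered state space. The fallback step is a crude stand-in for the robustness machinery a real construction would need when an unseen (state, symbol) pair is reached.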

Michael Ginn, Alexis Palmer, Mans Hulden • 2026

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Grapheme-to-Phoneme | SIGMORPHON 2020 (test) | – | 10 |
| Historical normalization | hun historical normalization (test) | Accuracy: 31.6 | 4 |
| Historical normalization | swe historical normalization (test) | Accuracy: 0.579 | 4 |
| Historical normalization | slv historical normalization (test) | Accuracy: 80.1 | 4 |
| Historical normalization | spa historical normalization (test) | Accuracy: 64.6 | 4 |
| Morphological Inflection | aka morphological inflection (test) | Accuracy: 97.5 | 4 |
| Morphological Inflection | ceb morphological inflection (test) | Accuracy: 86.5 | 4 |
| Morphological Inflection | crh morphological inflection (test) | Accuracy: 88.8 | 4 |
| Morphological Inflection | czn morphological inflection (test) | Accuracy: 66.6 | 4 |
| Morphological Inflection | dje morphological inflection (test) | Accuracy: 75 | 4 |

Showing 10 of 30 rows.
