
Morphological Inflection Generation with Hard Monotonic Attention

About

We present a neural model for morphological inflection generation which employs a hard attention mechanism, inspired by the nearly-monotonic alignment commonly found between the characters in a word and the characters in its inflection. We evaluate the model on three previously studied morphological inflection generation datasets and show that it provides state-of-the-art results in various setups compared to previous neural and non-neural approaches. Finally, we present an analysis of the continuous representations learned by both the hard attention model and the soft attention model (Bahdanau et al., 2014) for the task, shedding some light on the features such models extract.
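The core of the approach can be illustrated as a write/step transducer: the decoder attends to exactly one input character at a time and, at each step, either writes an output character or advances the hard attention pointer to the next input character. Below is a minimal sketch of that control loop, assuming an oracle action sequence (in the paper, a neural network predicts these actions from learned character representations; the function and example here are illustrative only).

```python
# Minimal sketch of hard monotonic attention decoding.
# Assumption: actions are given as an oracle sequence (derived from a
# character alignment); in the actual model an LSTM predicts each action
# conditioned on the currently attended input character.

def execute(actions, input_chars):
    """Run a write/step action sequence over the input characters.

    'STEP' advances the hard attention pointer monotonically;
    ('WRITE', c) appends character c to the output.
    """
    pointer = 0          # index of the currently attended input character
    output = []
    for act in actions:
        if act == 'STEP':
            # Monotonic: the pointer only moves forward, never back.
            pointer = min(pointer + 1, len(input_chars) - 1)
        else:
            output.append(act[1])
    return ''.join(output)

# Toy example: inflecting "walk" -> "walked". The aligned prefix is
# copied character by character, then the suffix "ed" is written.
actions = [
    ('WRITE', 'w'), 'STEP',
    ('WRITE', 'a'), 'STEP',
    ('WRITE', 'l'), 'STEP',
    ('WRITE', 'k'), 'STEP',
    ('WRITE', 'e'), ('WRITE', 'd'),
]
print(execute(actions, 'walk'))  # walked
```

Because the alignment between a word and its inflection is nearly monotonic, most actions are copy-like write/step pairs, which is what makes the hard attention pointer a natural fit for this task.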

Roee Aharoni, Yoav Goldberg • 2016

Related benchmarks

Task              Dataset                          Metric               Result  Rank
Spell Correction  Twitter dataset (test)           Whole-Word Accuracy  52.2    5
OCR Correction    Finnish newspaper corpus (test)  Whole-Word Accuracy  58.4    5
