
Multi-Task Cross-Lingual Sequence Tagging from Scratch

About

We present a deep hierarchical recurrent neural network for sequence tagging. Given a sequence of words, our model employs deep gated recurrent units on both character and word levels to encode morphology and context information, and applies a conditional random field layer to predict the tags. Our model is task independent, language independent, and feature engineering free. We further extend our model to multi-task and cross-lingual joint training by sharing the architecture and parameters. Our model achieves state-of-the-art results in multiple languages on several benchmark tasks including POS tagging, chunking, and NER. We also demonstrate that multi-task and cross-lingual joint training can improve the performance in various cases.
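The conditional random field layer mentioned above predicts the most likely tag sequence jointly rather than token by token. A minimal sketch of the decoding step is Viterbi search over per-token emission scores and a tag-transition matrix; the scores below are hypothetical stand-ins for what the GRU layers and learned transitions would produce, not values from the paper.

```python
def viterbi(emissions, transitions):
    """Find the highest-scoring tag path.

    emissions: list of {tag: score} dicts, one per token
               (stand-ins for the word-level GRU outputs).
    transitions: {(prev_tag, cur_tag): score} dict
                 (stand-in for the learned CRF transition matrix).
    """
    tags = list(emissions[0])
    # Best score for each tag at the first position.
    scores = {t: emissions[0][t] for t in tags}
    backptrs = []
    for emit in emissions[1:]:
        new_scores, ptrs = {}, {}
        for cur in tags:
            # Pick the previous tag that maximizes path score into `cur`.
            best_prev = max(tags, key=lambda p: scores[p] + transitions[(p, cur)])
            new_scores[cur] = scores[best_prev] + transitions[(best_prev, cur)] + emit[cur]
            ptrs[cur] = best_prev
        scores = new_scores
        backptrs.append(ptrs)
    # Trace back from the best final tag.
    last = max(tags, key=lambda t: scores[t])
    path = [last]
    for ptrs in reversed(backptrs):
        path.append(ptrs[path[-1]])
    return list(reversed(path))


# Toy two-token example with tags O (outside) and B (begin entity).
emissions = [{"O": 1.0, "B": 2.0}, {"O": 0.5, "B": 0.4}]
transitions = {("O", "O"): 0.1, ("O", "B"): 0.0,
               ("B", "O"): 0.2, ("B", "B"): -0.5}
print(viterbi(emissions, transitions))  # -> ['B', 'O']
```

The transition scores let the model penalize implausible tag pairs (here, B followed directly by B), which is what makes the CRF prediction joint over the whole sequence.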

Zhilin Yang, Ruslan Salakhutdinov, William Cohen • 2016

Related benchmarks

Task                      Dataset             Metric        Result   Rank
Named Entity Recognition  CoNLL 2003 (test)   F1 Score      91.62    539
Chunking                  CoNLL 2000 (test)   F1 Score      94.66    88
Tag prediction            Twitter dataset     Precision@1   29.61    6
