
Playing with Words at the National Library of Sweden – Making a Swedish BERT

About

This paper introduces the Swedish BERT ("KB-BERT") developed by KBLab for data-driven research at the National Library of Sweden (KB). Building on recent efforts to create transformer-based BERT models for languages other than English, we explain how we used KB's collections to create and train a new language-specific BERT model for Swedish. We also present the results of our model in comparison with existing models, chiefly that produced by the Swedish Public Employment Service, Arbetsförmedlingen, and Google's multilingual M-BERT, and demonstrate that KB-BERT outperforms these on a range of NLP tasks from named entity recognition (NER) to part-of-speech tagging (POS). Our discussion highlights the difficulties that continue to exist given the lack of training data and testbeds for smaller languages like Swedish. We release our model for further exploration and research here: https://github.com/Kungbib/swedish-bert-models

Martin Malmsten, Love Börjeson, Chris Haffenden • 2020

Related benchmarks

Task                          Dataset       Metric    Result  Rank
Acceptability Classification  DaLAJ (test)  Accuracy  74.2    12
Acceptability Classification  DaLAJ (dev)   Accuracy  71.9    12
