
Portuguese Named Entity Recognition using BERT-CRF

About

Recent advances in language representation using neural networks have made it viable to transfer the learned internal states of a trained model to downstream natural language processing tasks, such as named entity recognition (NER) and question answering. It has been shown that leveraging pre-trained language models improves overall performance on many tasks and is highly beneficial when labeled data is scarce. In this work, we train Portuguese BERT models and apply a BERT-CRF architecture to the NER task in Portuguese, combining the transfer capabilities of BERT with the structured predictions of a CRF. We explore both feature-based and fine-tuning training strategies for the BERT model. Our fine-tuning approach obtains new state-of-the-art results on the HAREM I dataset, improving the F1-score by 1 point on the selective scenario (5 NE classes) and by 4 points on the total scenario (10 NE classes).

Fábio Souza, Rodrigo Nogueira, Roberto Lotufo • 2019
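In a BERT-CRF architecture, BERT produces per-token emission scores and the CRF layer adds learned tag-transition scores, so the predicted label sequence is the highest-scoring path rather than independent per-token argmaxes. The sketch below illustrates only the CRF decoding step (Viterbi) in plain Python, with hypothetical toy scores; it is not the authors' implementation, and a real model would compute emissions with BERT and learn the transition matrix during training.

```python
def viterbi_decode(emissions, transitions):
    """Return the highest-scoring tag sequence for one sentence.

    emissions: per-token lists of tag scores (in BERT-CRF these would
        come from the BERT encoder); emissions[t][j] scores tag j at token t.
    transitions: transitions[i][j] is the learned score of moving from
        tag i to tag j between adjacent tokens.
    """
    num_tags = len(emissions[0])
    # score[j] = best score of any path ending in tag j at the current token
    score = list(emissions[0])
    history = []  # back-pointers for each token after the first

    for emission in emissions[1:]:
        next_score, backptr = [], []
        for j in range(num_tags):
            # best previous tag to transition into tag j
            best_prev = max(range(num_tags),
                            key=lambda i: score[i] + transitions[i][j])
            next_score.append(score[best_prev] + transitions[best_prev][j]
                              + emission[j])
            backptr.append(best_prev)
        score = next_score
        history.append(backptr)

    # backtrack from the best final tag to recover the full path
    best_tag = max(range(num_tags), key=lambda j: score[j])
    path = [best_tag]
    for backptr in reversed(history):
        best_tag = backptr[best_tag]
        path.append(best_tag)
    path.reverse()
    return path
```

With zero transition scores this reduces to per-token argmax; strongly negative transitions (e.g. forbidding `O -> I-PER`) let the CRF override locally confident but structurally invalid emissions.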

Related benchmarks

Task                      | Dataset    | Accuracy | Rank
Legal Text Classification | CAIL       | 88.81    | 18
Legal Text Classification | LEVEN      | 72.94    | 18
Legal Text Classification | QA         | 79.17    | 18
Legal Text Classification | Overruling | 95.88    | 18
Legal Text Classification | LEDGAR     | 86.35    | 18
