
Exploiting BERT for End-to-End Aspect-based Sentiment Analysis

About

In this paper, we investigate the modeling power of contextualized embeddings from pre-trained language models, e.g., BERT, on the E2E-ABSA task. Specifically, we build a series of simple yet insightful neural baselines for E2E-ABSA. The experimental results show that, even with a simple linear classification layer, our BERT-based architecture outperforms state-of-the-art works. In addition, we standardize the comparative study by consistently using a hold-out validation set for model selection, a practice largely ignored by previous works. Our work can therefore serve as a BERT-based benchmark for E2E-ABSA.
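The paper casts E2E-ABSA as a single sequence-labeling problem: each token receives one unified tag that encodes both the aspect boundary and its sentiment (e.g., O vs. {B, I, E, S}-{POS, NEG, NEU}). The sketch below, a hedged illustration rather than the authors' code, shows how such a unified tag sequence can be decoded back into (aspect span, sentiment) pairs; the tag names and the `decode_unified_tags` helper are assumptions for illustration.

```python
def decode_unified_tags(tags):
    """Decode a unified E2E-ABSA tag sequence into (span, sentiment) pairs.

    Assumes a collapsed BIEOS-style scheme: each token is "O" or
    "{B,I,E,S}-{POS,NEG,NEU}", so aspect extraction and aspect-level
    sentiment classification are solved jointly by one sequence labeler.
    Returns a list of ((start, end), sentiment) tuples, indices inclusive.
    """
    spans, start = [], None
    for i, tag in enumerate(tags):
        if tag == "O":                 # outside any aspect term
            start = None
            continue
        boundary, sentiment = tag.split("-")
        if boundary == "S":            # single-token aspect term
            spans.append(((i, i), sentiment))
            start = None
        elif boundary == "B":          # aspect term begins here
            start = i
        elif boundary == "E" and start is not None:
            spans.append(((start, i), sentiment))  # aspect term ends here
            start = None
        # boundary == "I": inside an ongoing aspect term, keep scanning
    return spans
```

For example, the tag sequence for "the [battery life]_POS is great but the [screen]_NEG flickers" would decode into one positive two-token span and one negative single-token span.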

Xin Li, Lidong Bing, Wenxuan Zhang, Wai Lam • 2019

Related benchmarks

Task                                        Dataset                         Metric     Result   Rank
Named Entity Recognition                    CoNLL 2003                      F1 Score   91.76    86
Named Entity Recognition                    WNUT 2017                       F1 Score   55.76    79
Aspect-Term Sentiment Analysis              LAPTOP SemEval 2014 (test)      Macro-F1   61.12    69
Named Entity Recognition                    WeiboNER                        F1 Score   69.53    27
Entity-Level Financial Sentiment Analysis   EFSA                            F1 Score   73.77    23
Sequence Labeling                           Restaurant 14                   F1 Score   74       20
Sequence Labeling                           Restaurant 16                   F1 Score   71.47    20
Sequence Labeling                           Restaurant 15                   F1 Score   61.58    20
Sequence Labeling                           Laptop 14                       F1 Score   61.33    20
End-to-End Aspect-Based Sentiment Analysis  REST SemEval 2015/2016 (test)   F1 Score   74.72    10

(10 of 13 rows shown.)

Other info

Code
