Self-Alignment Pretraining for Biomedical Entity Representations

About

Despite the widespread success of self-supervised learning via masked language models (MLM), accurately capturing fine-grained semantic relationships in the biomedical domain remains a challenge. This is of paramount importance for entity-level tasks such as entity linking, where the ability to model entity relations (especially synonymy) is pivotal. To address this challenge, we propose SapBERT, a pretraining scheme that self-aligns the representation space of biomedical entities. We design a scalable metric learning framework that can leverage UMLS, a massive collection of biomedical ontologies with 4M+ concepts. In contrast with previous pipeline-based hybrid systems, SapBERT offers an elegant one-model-for-all solution to the problem of medical entity linking (MEL), achieving a new state-of-the-art (SOTA) on six MEL benchmarking datasets. In the scientific domain, we achieve SOTA even without task-specific supervision. With substantial improvement over various domain-specific pretrained MLMs such as BioBERT, SciBERT, and PubMedBERT, our pretraining scheme proves to be both effective and robust.
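The self-alignment objective pulls UMLS synonyms of the same concept (CUI) together in embedding space while pushing names of different concepts apart. Below is a minimal, illustrative sketch of that idea in PyTorch: it substitutes a simple InfoNCE-style contrastive loss for the paper's actual objective (a multi-similarity loss with online hard-pair mining), and the base checkpoint name, the 25-token name length, and the toy synonym pairs are assumptions for illustration, not taken from this page.

```python
# Minimal sketch of self-alignment pretraining on UMLS synonym pairs.
# Illustrative only: SapBERT itself uses a multi-similarity loss with
# online hard-pair mining; a simple InfoNCE contrastive loss stands in here.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

MODEL = "microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext"  # assumed base encoder
tok = AutoTokenizer.from_pretrained(MODEL)
enc = AutoModel.from_pretrained(MODEL)

def embed(names):
    """[CLS] embedding for a batch of entity names (short strings)."""
    batch = tok(names, padding=True, truncation=True, max_length=25,
                return_tensors="pt")
    return enc(**batch).last_hidden_state[:, 0]

def self_alignment_loss(anchors, positives, temperature=0.05):
    """InfoNCE over (anchor, synonym) name pairs sharing a CUI.
    In-batch names belonging to other CUIs act as negatives."""
    a = F.normalize(embed(anchors), dim=-1)
    p = F.normalize(embed(positives), dim=-1)
    logits = a @ p.T / temperature         # pairwise cosine similarities
    targets = torch.arange(len(anchors))   # matching synonym is the positive
    return F.cross_entropy(logits, targets)

# One toy step: each pair is two UMLS synonyms of the same concept.
pairs = [("myocardial infarction", "heart attack"),
         ("hypertension", "high blood pressure")]
loss = self_alignment_loss([a for a, _ in pairs], [b for _, b in pairs])
loss.backward()
```

At full scale, this objective runs over millions of UMLS synonym pairs, with large batches so that the in-batch negatives (and, in the paper, the mined hard pairs) remain informative.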

Fangyu Liu, Ehsan Shareghi, Zaiqiao Meng, Marco Basaldella, Nigel Collier • 2020

Related benchmarks

Task | Dataset | Metric | Result | Rank
Question Answering | MedQA-USMLE (test) | Accuracy | 37.2 | 101
Biomedical Entity Linking | NCBI | Acc@1 | 92.3 | 20
Biomedical Entity Linking | COMETA | Acc@1 | 75.1 | 20
Biomedical Entity Linking | AAP | Acc@1 | 89 | 15
Biomedical Entity Linking | BC5CDR | Acc@1 | 88.6 | 15
Biomedical Entity Linking | MM-ST21pv | Acc@1 | 50.3 | 13
Entity Linking | QUAERO-MEDLINE French (test) | Recall@1 | 50.6 | 11
Entity Linking | QUAERO-EMEA French (test) | Recall@1 | 49.8 | 11
Entity Linking | SPACCC Spanish (test) | Recall@1 | 33.9 | 11
Entity Linking | MM-ST21PV English (test) | Recall@1 | 51.1 | 11
(Showing 10 of 15 rows.)
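The entity-linking scores above (Acc@1 / Recall@1) are nearest-neighbour retrieval scores: every concept name in the target ontology is embedded once, a mention is embedded with the same encoder, and the concept ID of the top-ranked name is compared against gold. A minimal sketch follows, reusing the `embed` helper from the earlier snippet; the three-entry dictionary and gold mentions are toy examples, not real benchmark data.

```python
# Sketch of Acc@1-style evaluation via nearest-neighbour linking.
# `embed` is the encoder helper from the snippet above; the dictionary
# and gold labels below are illustrative, not a real UMLS subset.
import torch
import torch.nn.functional as F

dictionary = [("C0027051", "myocardial infarction"),
              ("C0020538", "hypertension"),
              ("C0011849", "diabetes mellitus")]
names = [name for _, name in dictionary]

with torch.no_grad():
    name_emb = F.normalize(embed(names), dim=-1)  # index all concept names once

def link(mention):
    """Return the CUI of the dictionary name nearest to the mention."""
    with torch.no_grad():
        m = F.normalize(embed([mention]), dim=-1)
    best = (m @ name_emb.T).argmax().item()       # cosine nearest neighbour
    return dictionary[best][0]

gold = [("heart attack", "C0027051"), ("high blood pressure", "C0020538")]
acc_at_1 = sum(link(m) == cui for m, cui in gold) / len(gold)
print(f"Acc@1 = {acc_at_1:.3f}")
```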
