
Self-Alignment Pretraining for Biomedical Entity Representations

About

Despite the widespread success of self-supervised learning via masked language models (MLM), accurately capturing fine-grained semantic relationships in the biomedical domain remains a challenge. This is of paramount importance for entity-level tasks such as entity linking, where the ability to model entity relations (especially synonymy) is pivotal. To address this challenge, we propose SapBERT, a pretraining scheme that self-aligns the representation space of biomedical entities. We design a scalable metric learning framework that can leverage UMLS, a massive collection of biomedical ontologies with 4M+ concepts. In contrast with previous pipeline-based hybrid systems, SapBERT offers an elegant one-model-for-all solution to the problem of medical entity linking (MEL), achieving a new state-of-the-art (SOTA) on six MEL benchmarking datasets. In the scientific domain, we achieve SOTA even without task-specific supervision. With substantial improvement over various domain-specific pretrained MLMs such as BioBERT, SciBERT, and PubMedBERT, our pretraining scheme proves to be both effective and robust.
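The core idea of self-alignment is to fine-tune an encoder so that surface forms (synonyms) of the same UMLS concept embed close together while names of different concepts are pushed apart. SapBERT itself uses a multi-similarity loss with online hard pair mining over UMLS synonym pairs; the minimal numpy sketch below illustrates the general idea with a simpler InfoNCE-style contrastive loss instead, using hypothetical entity names and made-up concept IDs (random vectors stand in for BERT embeddings).

```python
import numpy as np

# Hypothetical toy data: three surface forms per UMLS concept.
names = ["myocardial infarction", "heart attack", "MI",
         "type 1 diabetes", "T1DM", "juvenile diabetes"]
concept_of = ["C0027051", "C0027051", "C0027051",
              "C0011854", "C0011854", "C0011854"]

# Random vectors stand in for encoder outputs; in SapBERT these would be
# [CLS] embeddings from a biomedical BERT being fine-tuned.
rng = np.random.default_rng(0)
emb = rng.normal(size=(len(names), 8))


def l2_normalize(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)


def self_alignment_loss(emb, concept_of, temperature=0.07):
    """InfoNCE-style contrastive loss: for each name, treat names of the
    same concept as positives and all other names as negatives."""
    z = l2_normalize(emb)
    sim = z @ z.T / temperature
    losses = []
    n = len(z)
    for i in range(n):
        # Indices of synonyms of name i (same concept, excluding itself).
        pos = [j for j in range(n) if j != i and concept_of[j] == concept_of[i]]
        logits = np.delete(sim[i], i)                 # drop self-similarity
        labels = [j if j < i else j - 1 for j in pos]  # re-index after delete
        log_probs = logits - np.log(np.exp(logits).sum())
        losses.append(-np.mean(log_probs[labels]))     # pull positives up
    return float(np.mean(losses))


loss = self_alignment_loss(emb, concept_of)
print(f"self-alignment loss on random embeddings: {loss:.3f}")
```

In training, this loss would be minimized by gradient descent over the encoder parameters; at inference, MEL reduces to nearest-neighbor search between a mention embedding and the embeddings of all concept names.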

Fangyu Liu, Ehsan Shareghi, Zaiqiao Meng, Marco Basaldella, Nigel Collier • 2020

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Question Answering | MedQA-USMLE (test) | Accuracy | 37.2 | 101 |
| Semantic Textual Similarity | BIOSSES | Spearman Correlation | 82.48 | 40 |
| Information Retrieval | COVID | nDCG@10 | 33.4 | 37 |
| Feature Selection Alignment | Expert-labeled disease feature relevance dataset | T1D Alignment Score | 75.8 | 26 |
| Relatedness Detection | Multi-institutional EHR dataset | AUC | 0.847 | 25 |
| Concept Similarity Detection | Multi-institutional EHR dataset | AUC | 89.7 | 25 |
| Feature Selection Evaluation | GPT-4 Feature Relevance Estimation Suite Silver Standard (test) | T1D Score | 56.8 | 25 |
| Clinical Relatedness Detection | General Clinical Relation Pairs | AUC | 75.3 | 25 |
| Clinical Similarity Detection | General Clinical Relation Pairs | AUC | 0.803 | 25 |
| Cross-institutional Code Mapping | UPMC LAB-LOINC | Spearman's Rank Correlation | 0.529 | 24 |

Showing 10 of 39 rows.
