Self-Alignment Pretraining for Biomedical Entity Representations
About
Despite the widespread success of self-supervised learning via masked language models (MLM), accurately capturing fine-grained semantic relationships in the biomedical domain remains a challenge. This is of paramount importance for entity-level tasks such as entity linking, where the ability to model entity relations (especially synonymy) is pivotal. To address this challenge, we propose SapBERT, a pretraining scheme that self-aligns the representation space of biomedical entities. We design a scalable metric learning framework that can leverage UMLS, a massive collection of biomedical ontologies with 4M+ concepts. In contrast with previous pipeline-based hybrid systems, SapBERT offers an elegant one-model-for-all solution to the problem of medical entity linking (MEL), achieving a new state-of-the-art (SOTA) on six MEL benchmarking datasets. In the scientific domain, we achieve SOTA even without task-specific supervision. With substantial improvement over various domain-specific pretrained MLMs such as BioBERT, SciBERT, and PubMedBERT, our pretraining scheme proves to be both effective and robust.
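The self-alignment idea above can be illustrated with a small sketch: given embeddings of entity mentions labeled by their UMLS concept, synonyms (mentions sharing a concept ID) are pulled together while mentions of different concepts are pushed apart. The toy loss below is an InfoNCE-style simplification written in plain NumPy — SapBERT's actual objective is a multi-similarity loss with online hard-pair mining over BERT-encoded names, so treat this only as a conceptual sketch, not the paper's implementation.

```python
import numpy as np

def synonym_alignment_loss(embeddings, labels, temperature=0.07):
    """Toy InfoNCE-style self-alignment loss.

    embeddings: (n, d) array, one row per entity mention.
    labels:     length-n list of concept IDs; mentions with the same
                ID are synonyms and act as positive pairs.
    """
    # L2-normalise so dot products are cosine similarities.
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = (z @ z.T) / temperature

    n = len(labels)
    loss, count = 0.0, 0
    for i in range(n):
        # Positives: every other mention of the same concept.
        positives = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not positives:
            continue
        exp_sim = np.exp(sim[i])
        denom = exp_sim.sum() - exp_sim[i]  # all pairs except the self-pair
        for j in positives:
            loss += -np.log(exp_sim[j] / denom)
            count += 1
    return loss / count
```

A representation space in which synonyms cluster together yields a low loss, while one where mentions of the same concept are scattered yields a high loss — minimizing this objective over UMLS synonym sets is the "self-alignment" that the pretraining scheme performs at scale.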
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Question Answering | MedQA-USMLE (test) | Accuracy | 37.2 | 101 |
| Semantic Textual Similarity | BIOSSES | Spearman Correlation | 82.48 | 40 |
| Information Retrieval | COVID | nDCG@10 | 33.4 | 37 |
| Feature Selection Alignment | Expert-labeled disease feature relevance dataset | T1D Alignment Score | 75.8 | 26 |
| Relatedness Detection | Multi-institutional EHR dataset | AUC | 0.847 | 25 |
| Concept Similarity Detection | Multi-institutional EHR dataset | AUC | 89.7 | 25 |
| Feature Selection Evaluation | GPT-4 Feature Relevance Estimation Suite Silver Standard (test) | T1D Score | 56.8 | 25 |
| Clinical Relatedness Detection | General Clinical Relation Pairs | AUC | 75.3 | 25 |
| Clinical Similarity Detection | General Clinical Relation Pairs | AUC | 0.803 | 25 |
| Cross-institutional Code Mapping | UPMC LAB-LOINC | Spearman's Rank Correlation | 0.529 | 24 |