Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking
About
Injecting external domain-specific knowledge (e.g., from UMLS) into pretrained language models (LMs) improves their ability to handle specialised in-domain tasks such as biomedical entity linking (BEL). However, such abundant expert knowledge is available only for a handful of languages (e.g., English). In this work, we propose a novel cross-lingual biomedical entity linking task (XL-BEL) and establish a new XL-BEL benchmark spanning 10 typologically diverse languages. We first investigate how standard knowledge-agnostic and knowledge-enhanced monolingual and multilingual LMs fare beyond the standard monolingual English BEL task; the results reveal large performance gaps relative to English. We then address the challenge of transferring domain-specific knowledge from resource-rich to resource-poor languages. To this end, we propose and evaluate a series of cross-lingual transfer methods for the XL-BEL task, and demonstrate that general-domain bitext helps propagate the available English knowledge to languages with little to no in-domain data. Remarkably, our proposed domain-specific transfer methods yield consistent gains across all target languages, sometimes up to 20 Precision@1 points, without any in-domain knowledge in the target language and without any in-domain parallel data.
Related benchmarks
| Task | Dataset | Result | Rank |
|---|---|---|---|
| Entity Linking | QUAERO-MEDLINE French (test) | Recall@1: 74.7 | 11 |
| Entity Linking | QUAERO-EMEA French (test) | Recall@1: 67.9 | 11 |
| Entity Linking | MM-ST21PV English (test) | Recall@1: 64.6 | 11 |
| Entity Linking | SPACCC Spanish (test) | Recall@1: 47.9 | 11 |
| Biomedical Entity Linking | XL-BEL 1.0 (test) | EN Precision@1: 78.7 | 10 |
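The Recall@1 and Precision@1 figures above follow the usual entity-linking convention: for each mention, a model ranks candidate concept IDs, and the score is the fraction of mentions whose gold concept appears in the top-k candidates. A minimal sketch of that computation (function names and the single-gold-per-mention assumption are ours, not from the paper's codebase):

```python
def hits_at_k(ranked_candidates, gold_id, k):
    """1 if the gold concept ID is among the top-k ranked candidates, else 0."""
    return int(gold_id in ranked_candidates[:k])

def precision_at_k(predictions, golds, k=1):
    """Fraction of mentions linked correctly within the top-k candidates.

    predictions: list of ranked candidate-ID lists, one per mention.
    golds: list of gold concept IDs, one per mention.
    """
    return sum(hits_at_k(p, g, k) for p, g in zip(predictions, golds)) / len(golds)

# With exactly one gold concept per mention, Recall@1 and Precision@1 coincide,
# which is why both metric names appear for the same kind of number above.
```

For example, `precision_at_k([["C0011849", "C0011860"], ["C0027051"]], ["C0011849", "C0020538"], k=1)` scores the first mention correct and the second wrong, giving 0.5.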