Pre-training of Graph Augmented Transformers for Medication Recommendation

About

Medication recommendation is an important healthcare application. It is commonly formulated as a temporal prediction task, so most existing works utilize only longitudinal electronic health records (EHRs) from the small number of patients with multiple visits, ignoring the large number of patients with a single visit (selection bias). Moreover, important hierarchical knowledge, such as the diagnosis hierarchy, is not leveraged in the representation learning process. To address these challenges, we propose G-BERT, a new model that combines the power of Graph Neural Networks (GNNs) and BERT (Bidirectional Encoder Representations from Transformers) for medical code representation and medication recommendation. We use GNNs to represent the internal hierarchical structures of medical codes, integrate the GNN representations into a transformer-based visit encoder, and pre-train it on EHR data from patients with only a single visit. The pre-trained visit encoder and representations are then fine-tuned on downstream predictive tasks over longitudinal EHRs from patients with multiple visits. G-BERT is the first model to bring the language-model pre-training schema into the healthcare domain, and it achieves state-of-the-art performance on the medication recommendation task.

Junyuan Shang, Tengfei Ma, Cao Xiao, Jimeng Sun • 2019
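The core idea of the abstract can be illustrated with a toy sketch: each medical code's embedding is enriched by aggregating over its ancestors in the code ontology (a crude stand-in for G-BERT's GNN step), and a visit representation is then pooled from the enriched code embeddings (a stand-in for the transformer visit encoder). This is a minimal sketch, not the paper's implementation; the hierarchy, dimensions, and all function names here are hypothetical, and real G-BERT uses learned attention-based aggregation rather than a plain mean.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ICD-9-style hierarchy mapping child code -> parent code.
# (Assumption: the real model operates on the full ICD-9/ATC ontologies.)
parent = {"250.00": "250", "250": "240-279", "401.9": "401", "401": "390-459"}

dim = 8
nodes = {"240-279", "390-459", "250", "401", "250.00", "401.9"}
emb = {c: rng.standard_normal(dim) for c in nodes}  # initial code embeddings

def ancestors(code):
    """Return the chain of ancestors of `code` in the toy hierarchy."""
    out = []
    while code in parent:
        code = parent[code]
        out.append(code)
    return out

def ontology_embed(code):
    """Enriched code embedding: mean of the code and its ancestors
    (a crude stand-in for the GNN aggregation over the ontology)."""
    vecs = [emb[code]] + [emb[a] for a in ancestors(code)]
    return np.mean(vecs, axis=0)

def visit_repr(codes):
    """Visit representation: mean-pool the enriched code embeddings
    (a stand-in for the transformer-based visit encoder)."""
    return np.mean([ontology_embed(c) for c in codes], axis=0)

v = visit_repr(["250.00", "401.9"])
print(v.shape)  # -> (8,)
```

Because ancestors contribute to every descendant's embedding, rare leaf codes borrow statistical strength from their category, which is the motivation for using the hierarchy during representation learning.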

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Readmission prediction | MIMIC-IV | AUC-ROC | 0.6538 | 70 |
| In-hospital mortality prediction | MIMIC-IV | AUROC | 0.9315 | 57 |
| Readmission prediction | MIMIC-III (target) | AUPRC | 57.19 | 35 |
| In-hospital mortality prediction | MIMIC-III | AUPRC | 72.13 | 25 |
| Mortality prediction | eICU | AUC-ROC | 0.8928 | 22 |
| Prolonged Length of Stay (PLOS) prediction | eICU | F1 score | 65.73 | 7 |
| Prolonged Length of Stay (PLOS) prediction | MIMIC-III | F1 score | 69.62 | 7 |
| Prolonged Length of Stay (PLOS) prediction | MIMIC-IV | F1 score | 61.27 | 7 |
