Hierarchical Pretraining on Multimodal Electronic Health Records

About

Pretraining has proven to be a powerful technique in natural language processing (NLP), exhibiting remarkable success in various downstream NLP tasks. In the medical domain, however, existing models pretrained on electronic health records (EHR) fail to capture the hierarchical nature of EHR data, limiting the ability of a single pretrained model to generalize across diverse downstream tasks. To tackle this challenge, this paper introduces MedHMP, a novel, general, and unified pretraining framework designed specifically for hierarchically multimodal EHR data. The effectiveness of MedHMP is demonstrated through experimental results on eight downstream tasks spanning three levels, and comparisons against eighteen baselines further highlight the efficacy of the approach.
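To give a concrete sense of what "hierarchical" means for EHR data, below is a minimal, hypothetical sketch of a three-level encoder (stay → admission → patient). The module names, dimensions, and aggregation choices are illustrative assumptions for this page, not the MedHMP architecture described in the paper.

```python
# Hypothetical sketch of a hierarchical encoder for multimodal EHR data.
# All names, shapes, and aggregation schemes are illustrative assumptions,
# NOT the MedHMP implementation.
import torch
import torch.nn as nn

class HierarchicalEHREncoder(nn.Module):
    """Encodes EHR data at three nested levels: stay -> admission -> patient.

    Each stay is treated as a bag of multimodal feature vectors (e.g.,
    vitals, labs, note embeddings) already projected to a shared dimension.
    """

    def __init__(self, dim: int = 64):
        super().__init__()
        # Stay level: fuse the modality vectors observed during one stay.
        self.stay_encoder = nn.TransformerEncoderLayer(
            d_model=dim, nhead=4, batch_first=True)
        # Admission level: aggregate the stays within one admission.
        self.admission_rnn = nn.GRU(dim, dim, batch_first=True)
        # Patient level: aggregate admissions into one patient vector.
        self.patient_rnn = nn.GRU(dim, dim, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (patients, admissions, stays, modalities, dim)
        p, a, s, m, d = x.shape
        # Fuse modalities within each stay, then mean-pool to one vector.
        stay_repr = self.stay_encoder(x.reshape(p * a * s, m, d)).mean(dim=1)
        # Roll stays up into admission representations (final GRU state).
        _, adm_h = self.admission_rnn(stay_repr.reshape(p * a, s, d))
        # Roll admissions up into a single patient representation.
        _, pat_h = self.patient_rnn(adm_h[-1].reshape(p, a, d))
        return pat_h[-1]  # (patients, dim)

encoder = HierarchicalEHREncoder(dim=64)
toy = torch.randn(2, 3, 4, 2, 64)  # 2 patients, 3 admissions, 4 stays, 2 modalities
print(encoder(toy).shape)          # torch.Size([2, 64])
```

A single encoder of this shape is what lets one pretrained model serve downstream tasks defined at any of the three levels, since intermediate stay- and admission-level representations are available inside the same forward pass.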

Xiaochen Wang, Junyu Luo, Jiaqi Wang, Ziyi Yin, Suhan Cui, Yuan Zhong, Yaqing Wang, Fenglong Ma · 2023

Related benchmarks

Task                                Dataset     Result          Rank
In-hospital mortality prediction    MIMIC-IV    AUROC 0.9763    57
In-hospital mortality prediction    MIMIC-III   AUPRC 63.842    25
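The Result column mixes two metrics. As a quick reference, here is how AUROC and AUPRC are typically computed for a binary in-hospital mortality label with scikit-learn; the labels and scores below are made-up toy values, not results from the paper.

```python
# Toy illustration of the two metrics reported above.
from sklearn.metrics import roc_auc_score, average_precision_score

y_true = [0, 0, 1, 0, 1, 1, 0, 1]                    # 1 = died in hospital
y_score = [0.1, 0.3, 0.8, 0.2, 0.6, 0.9, 0.4, 0.7]   # model risk scores

print("AUROC:", roc_auc_score(y_true, y_score))
print("AUPRC:", average_precision_score(y_true, y_score))
```

AUPRC is often preferred alongside AUROC for mortality prediction because the positive class (death) is rare, and precision-recall curves are more sensitive to performance on the minority class.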
