
Learning Representations from Incomplete EHR Data with Dual-Masked Autoencoding

About

Learning from electronic health record (EHR) time series is challenging due to irregular sampling, heterogeneous missingness, and the resulting sparsity of observations. Prior self-supervised methods either impute before learning, represent missingness through a dedicated input signal, or optimize solely for imputation, reducing their capacity to efficiently learn representations that support clinical downstream tasks. We propose the Augmented-Intrinsic Dual-Masked Autoencoder (AID-MAE), which learns directly from incomplete time series by applying an intrinsic missing mask to represent naturally missing values and an augmented mask that hides a subset of observed values for reconstruction during training. AID-MAE processes only the unmasked subset of tokens and consistently outperforms strong baselines, including XGBoost and DuETT, across multiple clinical tasks on two datasets. In addition, the learned embeddings naturally stratify patient cohorts in the representation space.
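The dual-masking idea in the abstract can be sketched in a few lines. The snippet below is a hypothetical illustration, not the authors' implementation: `dual_mask`, its arguments, and the masking ratio are all assumptions. The intrinsic mask marks which entries the EHR actually recorded; the augmented mask hides a random subset of those observed entries, which then serve as reconstruction targets, while the encoder sees only the remaining visible tokens.

```python
import numpy as np

def dual_mask(values, observed, aug_ratio=0.3, rng=None):
    """Sketch of AID-MAE-style dual masking (hypothetical helper).

    `observed` is the intrinsic mask: True where the EHR recorded a value.
    The augmented mask hides a random subset of observed entries; only
    those hidden-but-observed entries become reconstruction targets, so
    intrinsically missing values are never used as targets.
    """
    rng = np.random.default_rng(rng)
    # Sample the augmented mask only over genuinely observed entries.
    aug_mask = observed & (rng.random(values.shape) < aug_ratio)
    visible = observed & ~aug_mask           # tokens the encoder processes
    inputs = np.where(visible, values, 0.0)  # hidden/missing entries zeroed
    return inputs, visible, aug_mask

# Toy example: 2 patients x 4 time steps, with intrinsic missingness.
values = np.array([[1.0, 2.0, 0.0, 4.0],
                   [5.0, 0.0, 7.0, 8.0]])
observed = np.array([[True, True, False, True],
                     [True, False, True, True]])
inputs, visible, targets = dual_mask(values, observed, aug_ratio=0.5, rng=0)
```

Because the augmented mask is intersected with the intrinsic mask, reconstruction loss is only ever computed on entries that were truly observed, which is what lets the model learn directly from incomplete series without a prior imputation step.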

Xiao Xiang, David Restrepo, Hyewon Jeong, Yugang Jia, Leo Anthony Celi • 2026

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| In-hospital mortality prediction | MIMIC-IV | AUROC 0.877 | 13 |
| Acute Kidney Injury Prediction | PhysioNet Challenge 2012 | AUROC 0.773 | 12 |
| Mortality Prediction | PhysioNet Challenge 2012 | AUROC 0.782 | 6 |
| Mortality Prediction | PhysioNet Challenge 2012 (test) | AUROC 0.786 | 6 |
| Length of Stay | MIMIC-IV ICU | AUROC 0.776 | 5 |
