
ECG-MoE: Mixture-of-Expert Electrocardiogram Foundation Model

About

Electrocardiography (ECG) analysis is crucial for cardiac diagnosis, yet existing foundation models often fail to capture the periodicity and diverse features required for varied clinical tasks. We propose ECG-MoE, a hybrid architecture that integrates multi-model temporal features with a cardiac period-aware expert module. Our approach uses a dual-path Mixture-of-Experts to separately model beat-level morphology and rhythm, combined with a hierarchical fusion network using LoRA for efficient inference. Evaluated on five public clinical tasks, ECG-MoE achieves state-of-the-art performance with 40% faster inference than multi-task baselines.
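The dual-path Mixture-of-Experts described above can be illustrated with a minimal sketch: two separate expert pools (one standing in for beat-level morphology, one for rhythm), each with its own softmax gate, whose outputs are fused by concatenation. All names, dimensions, and the linear experts here are illustrative assumptions, not the paper's actual architecture or code.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class TwoPathMoE:
    """Toy dual-path mixture-of-experts: one expert pool per path
    ("morphology" and "rhythm"), each with its own gate; the two
    gate-weighted path outputs are concatenated as a simple fusion."""

    def __init__(self, d_in, d_out, n_experts=4, seed=0):
        rng = np.random.default_rng(seed)
        # Each path owns n_experts linear experts and a gating matrix.
        self.experts = {p: rng.standard_normal((n_experts, d_in, d_out)) * 0.1
                        for p in ("morphology", "rhythm")}
        self.gates = {p: rng.standard_normal((d_in, n_experts)) * 0.1
                      for p in ("morphology", "rhythm")}

    def forward(self, x):
        # x: (batch, d_in) pooled ECG features
        outs = []
        for p in ("morphology", "rhythm"):
            g = softmax(x @ self.gates[p])                     # (batch, n_experts)
            e = np.einsum("bi,nio->bno", x, self.experts[p])   # per-expert outputs
            outs.append(np.einsum("bn,bno->bo", g, e))         # gate-weighted sum
        return np.concatenate(outs, axis=-1)                   # (batch, 2 * d_out)

rng = np.random.default_rng(1)
moe = TwoPathMoE(d_in=8, d_out=4)
y = moe.forward(rng.standard_normal((2, 8)))
print(y.shape)  # (2, 8)
```

In the paper the two paths model beat-level morphology and rhythm separately before hierarchical fusion; this sketch only captures the gating-and-mixing mechanics of that design.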

Yuhao Xu, Xiaoda Wang, Yi Wu, Wei Jin, Xiao Hu, Carl Yang • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Age Estimation | ECG (test) | MAE | 12.83 | 12 |
| RR Interval Estimation | ECG (test) | MAE (ms) | 76.37 | 12 |
| Sex Classification | ECG (test) | F1 Score | 0.69 | 12 |
| Arrhythmia Detection | ECG data, 10,000 patients (test) | Accuracy (ACC) | 73 | 6 |
| Potassium Abnormality Prediction | ECG data, 10,000 patients (test) | F1 Score | 57 | 6 |
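For reference, the two metrics used in the table, mean absolute error (MAE) and F1 score, can be computed as below. The labels and predictions here are made-up toy values, not outputs of ECG-MoE.

```python
import numpy as np

# MAE, as used for age and RR-interval estimation (toy values).
y_true = np.array([62.0, 45.0, 70.0, 58.0])   # true targets
y_pred = np.array([60.0, 50.0, 65.0, 61.0])   # hypothetical predictions
mae = np.abs(y_true - y_pred).mean()

# Binary F1 score, as used for sex classification and potassium
# abnormality prediction (toy values).
labels = np.array([1, 0, 1, 1, 0])
preds  = np.array([1, 0, 0, 1, 1])
tp = ((preds == 1) & (labels == 1)).sum()     # true positives
fp = ((preds == 1) & (labels == 0)).sum()     # false positives
fn = ((preds == 0) & (labels == 1)).sum()     # false negatives
f1 = 2 * tp / (2 * tp + fp + fn)

print(mae)  # 3.75
print(round(f1, 2))  # 0.67
```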
