
ETP: Learning Transferable ECG Representations via ECG-Text Pre-training

About

In the domain of cardiovascular healthcare, the Electrocardiogram (ECG) serves as a critical, non-invasive diagnostic tool. Although recent strides in self-supervised learning (SSL) have been promising for ECG representation learning, these techniques often require annotated samples and struggle with classes not present in the fine-tuning stage. To address these limitations, we introduce ECG-Text Pre-training (ETP), an innovative framework designed to learn cross-modal representations that link ECG signals with textual reports. For the first time, this framework leverages the zero-shot classification task in the ECG domain. ETP employs an ECG encoder along with a pre-trained language model to align ECG signals with their corresponding textual reports. The proposed framework excels in both linear evaluation and zero-shot classification tasks on the PTB-XL and CPSC2018 datasets, demonstrating robust and generalizable cross-modal ECG feature learning.
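The abstract describes a CLIP-style setup: an ECG encoder and a pre-trained language model whose embeddings are aligned, which then enables zero-shot classification by comparing an ECG embedding against text embeddings of class descriptions. As a rough illustration only (the encoder outputs, shapes, and helper names below are assumptions for the sketch, not the authors' code), a minimal NumPy version of such a contrastive alignment objective and zero-shot step looks like:

```python
import numpy as np

def l2_normalize(x, axis=-1):
    """Project embeddings onto the unit sphere so dot products are cosines."""
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def clip_style_loss(ecg_emb, text_emb, temperature=0.07):
    """Symmetric contrastive (InfoNCE) loss over a batch of paired embeddings."""
    # Cosine similarity between every ECG and every report in the batch.
    logits = l2_normalize(ecg_emb) @ l2_normalize(text_emb).T / temperature
    n = logits.shape[0]
    labels = np.arange(n)  # the i-th ECG is paired with the i-th report

    def xent(lg):
        lg = lg - lg.max(axis=1, keepdims=True)  # numerical stability
        logp = lg - np.log(np.exp(lg).sum(axis=1, keepdims=True))
        return -logp[np.arange(n), labels].mean()

    # Average the ECG->text and text->ECG directions.
    return 0.5 * (xent(logits) + xent(logits.T))

def zero_shot_predict(ecg_emb, class_text_emb):
    """Assign each ECG to the class whose text embedding is most similar."""
    sims = l2_normalize(ecg_emb) @ l2_normalize(class_text_emb).T
    return sims.argmax(axis=1)

# Stand-ins for encoder outputs (random, for illustration only).
rng = np.random.default_rng(0)
batch, dim = 8, 32
ecg_emb = rng.normal(size=(batch, dim))   # would come from the ECG encoder
text_emb = rng.normal(size=(batch, dim))  # would come from the language model
loss = clip_style_loss(ecg_emb, text_emb)
```

After training drives this loss down, `zero_shot_predict` can score unseen classes directly from their textual descriptions, which is what lets the framework handle labels never seen during fine-tuning.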

Che Liu, Zhongwei Wan, Sibo Cheng, Mi Zhang, Rossella Arcucci • 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| ECG Classification | PTBXL Super | Macro AUC | 49.63 | 84 |
| ECG Classification | PTBXL Form | Macro AUC | 57.44 | 18 |
| ECG Classification | PTBXL Sub | Macro AUC | 62.25 | 18 |
| ECG Classification | PTBXL Rhythm | Macro AUC | 66.56 | 18 |
| ECG Classification | CSN | Macro AUC | 60.22 | 18 |
| ECG Classification | CPSC 2018 | Macro AUC (1%) | 67.67 | 17 |
| ECG Classification | CPSC 2018 | Macro AUC | 67.13 | 7 |
