
OSF: On Pre-training and Scaling of Sleep Foundation Models

About

Polysomnography (PSG) provides the gold standard for sleep assessment but suffers from substantial heterogeneity across recording devices and cohorts. There have been growing efforts to build general-purpose foundation models (FMs) for sleep physiology, but these efforts lack an in-depth understanding of the pre-training process and the scaling patterns that lead to more generalizable sleep FMs. To fill this gap, we curate a massive corpus of 166,500 hours of sleep recordings from nine public sources and establish SleepBench, a comprehensive, fully open-source benchmark. Leveraging SleepBench, we systematically evaluate four families of self-supervised pre-training objectives and uncover three critical findings: (1) existing FMs fail to generalize to missing channels at inference; (2) channel-invariant feature learning is essential for pre-training; and (3) scaling sample size, model capacity, and multi-source data mixture consistently improves downstream performance. With an enhanced pre-training and scaling recipe, we introduce OSF, a family of sleep FMs that achieves state-of-the-art performance across nine datasets on diverse sleep and disease prediction tasks. Further analysis of OSF also reveals intriguing properties in sample efficiency, hierarchical aggregation, and cross-dataset scaling.
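Findings (1) and (2) above concern robustness to missing channels. One common way to encourage channel-invariant features during pre-training is random channel dropout: whole channels are zeroed out so the model cannot rely on any single sensor. The sketch below is a hypothetical illustration of that idea, not the paper's actual objective; the function name and window shape are assumptions.

```python
import random

def channel_dropout(window, p=0.5, rng=None):
    """Zero out whole channels of a multi-channel PSG window at random.

    `window` is a list of channels, each a list of samples. This is a
    generic augmentation sketch for channel-invariant pre-training, not
    the method described in the paper.
    """
    rng = rng or random.Random()
    keep = [rng.random() >= p for _ in window]
    if not any(keep):                        # always retain at least one channel
        keep[rng.randrange(len(window))] = True
    return [ch if k else [0.0] * len(ch) for ch, k in zip(window, keep)]

# Example: an 8-channel, 30-second window (samples shown as constants)
window = [[1.0] * 3000 for _ in range(8)]
augmented = channel_dropout(window, p=0.5, rng=random.Random(0))
```

At inference time, a model pre-trained this way sees missing channels as just another dropout pattern, which is one plausible route to closing the generalization gap described in finding (1).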

Zitao Shuai, Zongzhe Xu, David Yang, Wei Wang, Yuzhe Yang • 2026

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Arousal Detection | MROS | AUC 83.9 | 21 |
| Hypopnea Detection | MROS | AUC 62.9 | 21 |
| Oxygen Desaturation Detection | MROS | AUC 68.4 | 21 |
| Sleep Staging | MROS | AUC 94.5 | 21 |
| Arousal Detection | MROS SleepBench (OOD evaluation cohort) | AUC 94.6 | 15 |
| Arousal Detection | SHHS | AUC 95.7 | 15 |
| Hypopnea Detection | SHHS | AUC 87.6 | 15 |
| Oxygen Desaturation Detection | MROS SleepBench (OOD evaluation cohort) | AUC 83.5 | 15 |
| Oxygen Desaturation Detection | SHHS | AUC 83.7 | 15 |
| Sleep Staging | MROS SleepBench (OOD evaluation cohort) | AUC 97.9 | 15 |

Showing 10 of 26 rows.
