
UniMTS: Unified Pre-training for Motion Time Series

About

Motion time series collected from mobile and wearable devices such as smartphones and smartwatches offer significant insights into human behavioral patterns, with wide applications in healthcare, automation, IoT, and AR/XR due to their low-power, always-on nature. However, given security and privacy concerns, building large-scale motion time series datasets remains difficult, preventing the development of pre-trained models for human activity analysis. Typically, existing models are trained and tested on the same dataset, leading to poor generalizability across variations in device location, device mounting orientation and human activity type. In this paper, we introduce UniMTS, the first unified pre-training procedure for motion time series that generalizes across diverse device latent factors and activities. Specifically, we employ a contrastive learning framework that aligns motion time series with text descriptions enriched by large language models. This helps the model learn the semantics of time series to generalize across activities. Given the absence of large-scale motion time series data, we derive and synthesize time series from existing motion skeleton data with all-joint coverage. Spatio-temporal graph networks are utilized to capture the relationships across joints for generalization across different device locations. We further design rotation-invariant augmentation to make the model agnostic to changes in device mounting orientations. Our model shows exceptional generalizability across 18 motion time series classification benchmark datasets, outperforming the best baselines by 340% in the zero-shot setting, 16.3% in the few-shot setting, and 9.2% in the full-shot setting.
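One of the ideas above, rotation-invariant augmentation, can be illustrated with a short sketch: applying a random 3D rotation to both the accelerometer and gyroscope axes of an IMU window so the model cannot rely on a fixed mounting orientation. This is a minimal illustration under assumed conventions (a `(T, 6)` array with accelerometer in the first three columns and gyroscope in the last three), not the paper's actual implementation; the function names are hypothetical.

```python
import numpy as np

def random_rotation_matrix(rng):
    """Sample a uniformly random proper rotation (det = +1) via QR
    decomposition of a Gaussian matrix."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q *= np.sign(np.diag(r))   # fix column signs for a unique Q
    if np.linalg.det(q) < 0:   # ensure a proper rotation, not a reflection
        q[:, 0] *= -1
    return q

def rotate_imu(series, rng=None):
    """Apply one shared random rotation to a (T, 6) IMU window
    (columns 0-2: accelerometer, columns 3-5: gyroscope)."""
    rng = rng or np.random.default_rng()
    R = random_rotation_matrix(rng)
    acc, gyr = series[:, :3], series[:, 3:]
    # The same rotation is applied to both sensors, mimicking a
    # device worn at an arbitrary but fixed orientation.
    return np.concatenate([acc @ R.T, gyr @ R.T], axis=1)
```

Because rotations preserve vector norms, the per-timestep signal magnitude is unchanged; only the decomposition across axes varies, which is exactly the nuisance factor the augmentation teaches the model to ignore.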

Xiyuan Zhang, Diyan Teng, Ranak Roy Chowdhury, Shuheng Li, Dezhi Hong, Rajesh K. Gupta, Jingbo Shang • 2024

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Activity Recognition | PAMAP2 | Accuracy | 47.2 | 22 |
| Human Activity Recognition | PAMAP2 (test) | Accuracy | 98 | 21 |
| Activity Recognition | mHealth | F1 Score | 61.8 | 17 |
| Human Activity Recognition | MRI | Accuracy | 95 | 16 |
| Human Activity Recognition | TotalCapture | Accuracy | 66 | 16 |
| Human Activity Recognition | UCI-HAR | Accuracy | 92.1 | 15 |
| Activity Recognition | Opportunity | Accuracy | 52.7 | 9 |
| Activity Recognition | RealWorld | Accuracy | 55.5 | 9 |
| Activity Recognition | UCI-HAR | Accuracy | 37.6 | 9 |
| Activity Recognition | MotionSense | Accuracy | 45.2 | 9 |

Showing 10 of 41 rows.
