
Pretrained Mobility Transformer: A Foundation Model for Human Mobility

About

Ubiquitous mobile devices are generating vast amounts of location-based service data that reveal how individuals navigate and utilize urban spaces in detail. In this study, we utilize these extensive, unlabeled sequences of user trajectories to develop a foundation model for understanding urban space and human mobility. We introduce the Pretrained Mobility Transformer (PMT), which leverages the transformer architecture to process user trajectories in an autoregressive manner, converting geographical areas into tokens and embedding spatial and temporal information within these representations. Experiments conducted in three U.S. metropolitan areas over a two-month period demonstrate PMT's ability to capture underlying geographic and socio-demographic characteristics of regions. The proposed PMT excels across various downstream tasks, including next-location prediction, trajectory imputation, and trajectory generation. These results support PMT's capability and effectiveness in decoding complex patterns of human mobility, offering new insights into urban spatial functionality and individual mobility preferences.

Xinhua Wu, Haoyu He, Yanchao Wang, Qi Wang • 2024

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Mobility Prediction | Hiroshima | DTW 4.85e+3 | 13 |
| Human mobility prediction | Sapporo | Acc@1 28.78 | 13 |
| Mobility Prediction | Sapporo | DTW 3.80e+3 | 13 |
| Human mobility prediction | Hiroshima | Acc@1 28.5 | 13 |
| Mobility Prediction | Kumamoto | DTW 4.54e+3 | 13 |
| Human mobility prediction | Kumamoto | Acc@1 26.97 | 13 |
