
Language-TPP: Integrating Temporal Point Processes with Language Models for Event Analysis

About

Temporal Point Processes (TPPs) have been widely used for event sequence modeling, but they often struggle to incorporate rich textual event descriptions effectively. Conversely, while Large Language Models (LLMs) have shown remarkable capabilities in processing textual data, they lack mechanisms for handling temporal dynamics. To bridge this gap, we introduce Language-TPP, a unified framework that integrates TPPs with LLMs for enhanced event sequence modeling. Language-TPP introduces a novel temporal encoding mechanism that converts continuous time intervals into specialized byte-tokens, enabling seamless integration with standard LLM architectures. This approach allows Language-TPP to achieve state-of-the-art performance across multiple TPP tasks, including event time prediction, type prediction, and intensity estimation, on five datasets. Additionally, we demonstrate that incorporating temporal information significantly improves the quality of generated event descriptions.
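The abstract does not spell out how continuous time intervals become byte-tokens. A minimal sketch of one plausible scheme, assuming the interval is serialized into its raw float32 bytes and each byte value is mapped to a dedicated special token added to the LLM vocabulary (the `<|byte_N|>` token template, `encode_interval`, and `decode_interval` names are all hypothetical, not from the paper):

```python
import struct

# Hypothetical template for 256 byte-level special tokens, one per byte value.
BYTE_TOKEN_TEMPLATE = "<|byte_{}|>"


def encode_interval(dt: float) -> list[str]:
    """Map a continuous inter-event time to a fixed-length token sequence.

    Sketch: pack the interval as a big-endian float32 (4 bytes), then
    replace each byte value (0-255) with its dedicated special token.
    """
    raw = struct.pack(">f", dt)
    return [BYTE_TOKEN_TEMPLATE.format(b) for b in raw]


def decode_interval(tokens: list[str]) -> float:
    """Invert encode_interval: recover the byte values and unpack the float."""
    byte_vals = bytes(int(t.split("_")[1].rstrip("|>")) for t in tokens)
    return struct.unpack(">f", byte_vals)[0]


# Example: a 1.5-unit gap between events becomes 4 byte-tokens that can be
# interleaved with ordinary text tokens in the LLM's input sequence.
tokens = encode_interval(1.5)
print(tokens)
print(decode_interval(tokens))
```

This kind of fixed-length byte encoding keeps the vocabulary overhead small (256 extra tokens) while letting a standard decoder-only LLM read and emit timestamps alongside event-type and description tokens.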

Quyu Kong, Yixuan Zhang, Yang Liu, Panrong Tong, Enqi Liu, Feng Zhou · 2025

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Event Prediction | StackOverflow | RMSE | 0.516 | 42 |
| Event sequence modeling | Chicago Crime | Accuracy | 27.2 | 13 |
| Event sequence modeling | NYC Taxi | Accuracy | 92 | 13 |
| Event sequence modeling | Amazon Review | Accuracy (%) | 69.7 | 13 |
| Event sequence modeling | US Earthquake | Accuracy | 64 | 13 |
| Multimodal Temporal Point Process prediction | DanmakuTPP (test) | RMSE | 5.3845 | 9 |
| Multimodal Temporal Point Process prediction | TAXI-PRO (test) | RMSE | 0.3376 | 9 |
