
TPP-LLM: Modeling Temporal Point Processes by Efficiently Fine-Tuning Large Language Models

About

Temporal point processes (TPPs) are widely used to model the timing and occurrence of events in domains such as social networks, transportation systems, and e-commerce. In this paper, we introduce TPP-LLM, a novel framework that integrates large language models (LLMs) with TPPs to capture both the semantic and temporal aspects of event sequences. Unlike traditional methods that rely on categorical event type representations, TPP-LLM directly utilizes the textual descriptions of event types, enabling the model to capture rich semantic information embedded in the text. While LLMs excel at understanding event semantics, they are less adept at capturing temporal patterns. To address this, TPP-LLM incorporates temporal embeddings and employs parameter-efficient fine-tuning (PEFT) methods to effectively learn temporal dynamics without extensive retraining. This approach improves both predictive accuracy and computational efficiency. Experimental results across diverse real-world datasets demonstrate that TPP-LLM outperforms state-of-the-art baselines in sequence modeling and event prediction, highlighting the benefits of combining LLMs with TPPs.
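The abstract describes representing each event by combining an LLM embedding of the event type's textual description with a temporal embedding of the event time. As a minimal illustrative sketch, the snippet below uses a sinusoidal (positional-encoding-style) temporal embedding concatenated with a text embedding; this specific encoding, and the function names, are assumptions for illustration, since the paper's exact temporal embedding and PEFT configuration are not given here.

```python
import math

def temporal_embedding(t, dim=8):
    """Sinusoidal embedding of an event time t.

    This is a common positional-encoding-style choice, assumed here for
    illustration; the actual TPP-LLM temporal embedding may differ.
    """
    half = dim // 2
    # Geometrically spaced frequencies, as in transformer positional encodings.
    freqs = [1.0 / (10000 ** (2 * i / dim)) for i in range(half)]
    return [math.sin(t * f) for f in freqs] + [math.cos(t * f) for f in freqs]

def event_representation(text_embedding, t, time_dim=8):
    """Combine an LLM embedding of the event-type description with a
    temporal embedding of the event time (here by simple concatenation).
    """
    return list(text_embedding) + temporal_embedding(t, time_dim)
```

In the full framework, sequences of such event representations would be fed to the LLM backbone, whose weights are then adapted with a parameter-efficient fine-tuning method rather than full retraining.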

Zefang Liu, Yinzhu Quan • 2024

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Event Prediction | StackOverflow | RMSE | 0.464 | 42 |
| Event sequence modeling | Chicago Crime | Accuracy (%) | 27.3 | 13 |
| Event sequence modeling | US Earthquake | Accuracy (%) | 64.1 | 13 |
| Event sequence modeling | NYC Taxi | Accuracy (%) | 91.8 | 13 |
| Event sequence modeling | Amazon Review | Accuracy (%) | 69.5 | 13 |
| Multimodal Temporal Point Process prediction | DanmakuTPP (test) | RMSE | 5.3035 | 9 |
| Multimodal Temporal Point Process prediction | TAXI-PRO (test) | RMSE | 0.3336 | 9 |
