
TokenShapley: Token Level Context Attribution with Shapley Value

About

Large language models (LLMs) demonstrate strong capabilities in in-context learning, but verifying the correctness of their generated responses remains a challenge. Prior work has explored attribution at the sentence level, but these methods fall short when users seek attribution for specific keywords within the response, such as numbers, years, or names. To address this limitation, we propose TokenShapley, a novel token-level attribution method that combines Shapley value-based data attribution with KNN-based retrieval techniques inspired by recent advances in KNN-augmented LLMs. By leveraging a precomputed datastore for contextual retrieval and computing Shapley values to quantify token importance, TokenShapley provides a fine-grained data attribution approach. Extensive evaluations on four benchmarks show that TokenShapley outperforms state-of-the-art baselines in token-level attribution, achieving an 11-23% improvement in accuracy.
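The core idea of Shapley value-based token attribution can be illustrated with a short sketch. The snippet below estimates per-token Shapley values by Monte Carlo permutation sampling over context tokens; the `utility` function here is a hypothetical stand-in (the paper's actual utility is built on KNN retrieval over a precomputed datastore of hidden states, which this toy example does not reproduce).

```python
import random

def shapley_token_attribution(tokens, utility, num_permutations=200, seed=0):
    """Monte Carlo estimate of each context token's Shapley value.

    `utility(subset)` maps a set of token indices to a score, e.g. how well
    the retained context supports a target answer span. This is a generic
    permutation-sampling estimator, not TokenShapley's exact procedure.
    """
    rng = random.Random(seed)
    n = len(tokens)
    values = [0.0] * n
    for _ in range(num_permutations):
        order = list(range(n))
        rng.shuffle(order)          # random arrival order of tokens
        included = set()
        prev = utility(included)
        for i in order:
            included.add(i)
            cur = utility(included)
            values[i] += cur - prev  # marginal contribution of token i
            prev = cur
    return [v / num_permutations for v in values]

# Toy utility: the answer "1969" is supported only when both the year
# token and "landing" are present in the retained context.
tokens = ["The", "moon", "landing", "was", "in", "1969", "."]
def utility(idx):
    return 1.0 if {2, 5} <= idx else 0.0

scores = shapley_token_attribution(tokens, utility)
```

Under this utility, only tokens 2 and 5 ever change the score, so their Shapley values sum to 1 while all other tokens receive exactly 0, matching the intuition that attribution should concentrate on the specific keywords (here, the year and its supporting word) that make the answer verifiable.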

Yingtai Xiao, Yuqing Zhu, Sirat Samyoun, Wanrong Zhang, Jiachen T. Wang, Jian Du • 2025

Related benchmarks

Task                 | Dataset                       | Metric               | Result | Rank
Attribution          | Verifiability-Granular (test) | Attribution Accuracy | 84.77  | 28
Attribution          | QuoteSum (test)               | Accuracy             | 92.51  | 18
Attribution          | KV Retrieval (test)           | Accuracy             | 1      | 9
Context Attribution  | CNN Dailymail (1000 examples) | Log Probability Drop | 1.33   | 9
