
LaTIM: Measuring Latent Token-to-Token Interactions in Mamba Models

About

State space models (SSMs), such as Mamba, have emerged as an efficient alternative to transformers for long-context sequence modeling. However, despite their growing adoption, SSMs lack the interpretability tools that have been crucial for understanding and improving attention-based architectures. While recent efforts provide insights into Mamba's internal mechanisms, they do not explicitly decompose token-wise contributions, leaving gaps in understanding how Mamba selectively processes sequences across layers. In this work, we introduce LaTIM, a novel token-level decomposition method for both Mamba-1 and Mamba-2 that enables fine-grained interpretability. We extensively evaluate our method across diverse tasks, including machine translation, copying, and retrieval-based generation, demonstrating its effectiveness in revealing Mamba's token-to-token interaction patterns.
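The general idea behind such a decomposition can be illustrated on a linear selective SSM. This is a hedged sketch, not the paper's exact formulation: for a recurrence h_t = A_t·h_{t-1} + B_t·x_t with output y_t = C_t·h_t, unrolling gives y_t = Σ_s C_t (Π_{k=s+1}^t A_k) B_s x_s, so each output token receives an additive contribution from every earlier input token. Collecting these terms yields a token-to-token contribution matrix, analogous to an attention map. The function name `token_contributions` and the scalar-channel setup here are illustrative assumptions.

```python
import numpy as np

def token_contributions(A, B, C, x):
    """Illustrative sketch (not the paper's exact method).
    A, B, C, x: arrays of shape (T,) for a single scalar SSM channel.
    Returns M of shape (T, T) with M[t, s] = additive contribution of
    input token s to output y_t, obtained by unrolling the recurrence."""
    T = len(x)
    M = np.zeros((T, T))
    for t in range(T):
        for s in range(t + 1):
            # Product of selective gates applied between step s and step t
            # (np.prod over an empty slice is 1, covering the s == t case).
            decay = np.prod(A[s + 1 : t + 1])
            M[t, s] = C[t] * decay * B[s] * x[s]
    return M

rng = np.random.default_rng(0)
T = 5
A = rng.uniform(0.5, 1.0, T)   # per-step decay (selective gate)
B = rng.normal(size=T)
C = rng.normal(size=T)
x = rng.normal(size=T)

M = token_contributions(A, B, C, x)

# Sanity check: each row of M sums back to the recurrent output y_t.
h, y = 0.0, []
for t in range(T):
    h = A[t] * h + B[t] * x[t]
    y.append(C[t] * h)
assert np.allclose(M.sum(axis=1), y)
```

Because M[t, s] is zero for s > t, the matrix is lower-triangular, reflecting the causal structure of the recurrence; visualizing it per layer is one way to read off token-to-token interaction patterns.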

Hugo Pitorro, Marcos Treviso · 2025

Related benchmarks

Task            | Dataset                          | Result   | Rank
Word Alignment  | RWTH Gold Alignment de-en (test) | AER 0.44 | 31
Token Alignment | IWSLT DE→EN 2017 (test)          | AER 0.43 | 22
Token Alignment | IWSLT Fr-En 2017 (test)          | AER 35   | 22
Copying         | Copying task                     | AUC 98   | 11

Other info

Code
