
Attention Frequency Modulation: Training-Free Spectral Modulation of Diffusion Cross-Attention

About

Cross-attention is the primary interface through which text conditions latent diffusion models, yet its step-wise multi-resolution dynamics remain under-characterized, limiting principled training-free control. We cast diffusion cross-attention as a spatiotemporal signal on the latent grid by summarizing token-softmax weights into token-agnostic concentration maps and tracking their radially binned Fourier power over the denoising trajectory. Across prompts and seeds, encoder cross-attention exhibits a consistent coarse-to-fine spectral progression, yielding a stable time-frequency fingerprint of token competition. Building on this structure, we introduce Attention Frequency Modulation (AFM), a plug-and-play inference-time intervention that edits token-wise pre-softmax cross-attention logits in the Fourier domain, before the token softmax is applied: low- and high-frequency bands are reweighted with a progress-aligned schedule and can be adaptively gated by token-allocation entropy. AFM provides a continuous handle for biasing the spatial scale of token-competition patterns without retraining, prompt editing, or parameter updates. Experiments on Stable Diffusion show that AFM reliably redistributes attention spectra and produces substantial visual edits while largely preserving semantic alignment. Finally, we find that entropy mainly acts as an adaptive gain on the same frequency-based edit rather than as an independent control axis.
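The core mechanic described in the abstract can be sketched in a few lines: take the per-token pre-softmax attention logits as a 2-D map on the latent grid, split their Fourier spectrum into low- and high-frequency bands by a radial cutoff, reweight each band, and only then apply the softmax over tokens. The sketch below is illustrative, not the paper's implementation: the function name `afm_modulate_logits`, the tensor layout `(H, W, T)`, the radial cutoff `cutoff_frac`, and the scalar band gains stand in for the paper's progress-aligned schedule and entropy gate, which are not reproduced here.

```python
import numpy as np

def afm_modulate_logits(logits, low_gain, high_gain, cutoff_frac=0.25):
    """Frequency-band reweighting of pre-softmax cross-attention logits.

    logits: array of shape (H, W, T) -- latent spatial grid x text tokens.
    low_gain / high_gain: hypothetical scalar gains for the two bands
    (in the paper these would follow a progress-aligned schedule and may
    be gated by token-allocation entropy).
    cutoff_frac: fraction of the Nyquist radius separating the bands.
    """
    H, W, T = logits.shape
    fy = np.fft.fftfreq(H)[:, None]          # vertical frequencies
    fx = np.fft.fftfreq(W)[None, :]          # horizontal frequencies
    radius = np.sqrt(fx**2 + fy**2)          # radial frequency per FFT bin
    low_mask = radius <= cutoff_frac * 0.5   # 0.5 = Nyquist for fftfreq
    gain = np.where(low_mask, low_gain, high_gain)  # (H, W) band gains

    out = np.empty_like(logits)
    for t in range(T):
        spec = np.fft.fft2(logits[:, :, t])            # per-token 2-D spectrum
        out[:, :, t] = np.real(np.fft.ifft2(spec * gain))
    return out  # the token softmax is applied to these edited logits

def softmax_tokens(logits):
    """Softmax over the token axis, yielding attention weights."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)
```

With `low_gain = high_gain = 1.0` the edit is the identity (FFT then inverse FFT), which is a useful sanity check; raising `low_gain` above `high_gain` biases token competition toward coarser spatial patterns, and the reverse biases it toward finer ones.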

Seunghun Oh, Unsang Park • 2026

Related benchmarks

Task                          Dataset   Metric                   Result    Rank
Text-to-Image Alignment       MS-COCO   CLIP Score               0.319     20
Paired Perceptual Deviation   COCO      LPIPS                    0.00e+0   10
Paired Perceptual Deviation   LAION     LPIPS                    0.00e+0   10
Text-Image Alignment          LAION     CLIP Cosine Similarity   0.306     10
