
Rethinking Test-Time Training: Tilting The Latent Distribution For Few-Shot Source-Free Adaptation

About

Constraints often arise in deployment settings where even lightweight parameter updates (e.g., parameter-efficient fine-tuning) could induce model shift or tuning instability. We study test-time adaptation of foundation models for few-shot classification in a completely frozen-model regime where, additionally, no upstream data are accessible. We propose arguably the first training-free inference method that adapts predictions to a new task by performing a change of measure over the latent embedding distribution induced by the encoder. Using task-similarity scores derived from a small labeled support set, exponential tilting reweights the latent distribution in a KL-optimal manner without modifying any model parameters. Empirically, the method consistently competes with parameter-update-based methods across multiple benchmarks and shot regimes, while operating under strictly stronger constraints. These results demonstrate the viability of inference-level distributional correction for test-time adaptation, even with a fully frozen model pipeline.
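To make the idea concrete, below is a minimal sketch of what exponentially tilting a latent distribution with support-set similarity scores can look like. The function name `tilted_predict`, the use of cosine similarity as the task-similarity score, and the temperature `lam` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def tilted_predict(query, support_emb, support_labels, lam=5.0):
    """Classify `query` by exponentially tilting weights over the support set.

    Each support example i receives weight w_i proportional to exp(lam * s_i),
    where s_i is the cosine similarity between the query and support embedding.
    Exponential tilting is the KL-optimal reweighting that shifts probability
    mass toward task-relevant regions of the latent space, with no change to
    encoder parameters. (Illustrative sketch, not the authors' exact method.)
    """
    # Cosine similarity scores between the query and each support embedding.
    q = query / np.linalg.norm(query)
    s = support_emb / np.linalg.norm(support_emb, axis=1, keepdims=True)
    scores = s @ q
    # Exponential tilting, computed as a softmax over lam * scores
    # (subtracting the max for numerical stability).
    logits = lam * scores
    logits -= logits.max()
    w = np.exp(logits)
    w /= w.sum()
    # Aggregate the tilted mass per class and predict the heaviest class.
    classes = np.unique(support_labels)
    mass = np.array([w[support_labels == c].sum() for c in classes])
    return classes[np.argmax(mass)]
```

With a 2-shot, 2-way support set, a query embedding close to class 0's support examples is assigned class 0, since the tilted weights concentrate on the most similar support points.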

Tahir Qasim Syed, Behraj Khan • 2026

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Few-shot classification | CIFAR FS (test) | - | 51 |
