
GIST: Targeted Data Selection for Instruction Tuning via Coupled Optimization Geometry

About

Targeted data selection has emerged as a crucial paradigm for efficient instruction tuning, aiming to identify a small yet influential subset of training examples for a specific target task. In practice, influence is often measured through the effect of an example on parameter updates. To make selection scalable, many approaches leverage optimizer statistics (e.g., Adam states) as an axis-aligned surrogate for update geometry (i.e., a diagonal preconditioner), implicitly treating parameters as coordinate-wise independent. We show that this assumption breaks down in parameter-efficient fine-tuning (PEFT) methods such as LoRA. In this setting, the induced optimization geometry exhibits strong cross-parameter coupling with non-trivial off-diagonal interactions, while the task-relevant update directions are confined to a low-dimensional subspace. Motivated by this mismatch, we propose GIST (Gradient Isometric Subspace Transformation), a simple yet principled alternative that replaces axis-aligned scaling with robust subspace alignment. GIST recovers a task-specific subspace from validation gradients via spectral filtering (SVD), projects training gradients into this coupled subspace, and scores examples by their alignment with target directions. Extensive experiments demonstrate that GIST matches or outperforms the state-of-the-art baseline with only 0.29% of the storage and 25% of the computational time under the same selection budget.
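The three-step pipeline the abstract describes (SVD-based subspace recovery, projection, alignment scoring) can be sketched in a few lines of NumPy. This is an illustrative reading of the abstract only, not the authors' implementation: the function name `gist_scores`, the choice of cosine alignment against the mean projected validation gradient, and all array shapes are assumptions.

```python
import numpy as np

def gist_scores(val_grads, train_grads, rank):
    """Hypothetical sketch of GIST-style scoring (names/choices assumed).

    val_grads:   (n_val, d) per-example validation gradients
    train_grads: (n_train, d) per-example training gradients
    rank:        number of top singular directions kept (spectral filter)
    """
    # 1. Recover a task-specific subspace: top right-singular vectors
    #    of the validation-gradient matrix (spectral filtering via SVD).
    _, _, vt = np.linalg.svd(val_grads, full_matrices=False)
    basis = vt[:rank]                              # (rank, d)

    # 2. Project training gradients into this coupled subspace.
    proj = train_grads @ basis.T                   # (n_train, rank)

    # 3. Score each example by cosine alignment with the mean
    #    projected validation gradient (one plausible target direction).
    target = (val_grads @ basis.T).mean(axis=0)    # (rank,)
    target /= np.linalg.norm(target) + 1e-12
    norms = np.linalg.norm(proj, axis=1) + 1e-12
    return (proj @ target) / norms                 # (n_train,) in [-1, 1]

# Usage: select the 10 best-aligned training examples under a budget.
rng = np.random.default_rng(0)
scores = gist_scores(rng.normal(size=(16, 64)),
                     rng.normal(size=(100, 64)), rank=4)
top_idx = np.argsort(scores)[::-1][:10]
```

Because scoring happens in a rank-`r` subspace rather than the full parameter dimension, only the small basis and projected gradients need to be stored, which is consistent with the storage savings the abstract reports.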

Guanghui Min, Tianhao Huang, Ke Wan, Chen Chen • 2026

Related benchmarks

Task | Dataset | Result | Rank
Language Understanding | MMLU | Accuracy 62.9 | 756
Reasoning | BBH | Accuracy 48 | 507
Multilingual Question Answering | TyDiQA | Accuracy 69.2 | 44
Instruction Tuning Data Selection Efficiency | Instruction Tuning Datasets, 270k examples (train) | Warmup Time (h) 1.5 | 2
