
Towards Efficient Post-Training via Fourier-Driven Adapter Architectures

About

We propose a novel framework, termed Fourier-Activated Adapter (FAA), for parameter-efficient fine-tuning of large pre-trained language models. By incorporating random Fourier features into lightweight adapter modules, FAA decomposes intermediate representations into complementary low- and high-frequency components, enabling frequency-aware modulation of semantic information. This design allows the model to selectively emphasize informative frequency bands during adaptation while preserving the representational capacity of the frozen backbone. Extensive experiments on GLUE, E2E NLG, and instruction-tuning benchmarks demonstrate that FAA consistently achieves competitive or superior performance compared to existing parameter-efficient fine-tuning methods, while maintaining low computational and memory overhead. Ablation studies further verify the effectiveness of frequency-aware activation and adaptive weighting mechanisms, highlighting FAA as a robust and efficient approach for post-training large language models.
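The abstract describes three ingredients: a lightweight adapter on a frozen backbone, random Fourier features that split the bottleneck representation into low- and high-frequency bands, and adaptive weights that mix the bands before projecting back. The paper's exact FAA architecture is not given here, so the following is only an illustrative sketch; the class name, dimensions, Gaussian frequency scales, and zero-initialized up-projection are all assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

class FourierAdapter:
    """Hypothetical sketch of a Fourier-activated adapter, not the paper's FAA."""

    def __init__(self, d_model=64, r=8, m=8, sigma_low=0.5, sigma_high=4.0):
        # Down-projection into a low-rank bottleneck; the backbone stays frozen.
        self.W_down = rng.normal(0.0, 0.02, (d_model, r))
        # Random Fourier frequencies: small sigma gives slowly varying
        # (low-frequency) features, large sigma rapidly varying (high-frequency).
        self.Omega_low = rng.normal(0.0, sigma_low, (r, m))
        self.Omega_high = rng.normal(0.0, sigma_high, (r, m))
        self.b_low = rng.uniform(0.0, 2 * np.pi, m)
        self.b_high = rng.uniform(0.0, 2 * np.pi, m)
        # Adaptive band weights (learned during adaptation; equal at init here).
        self.alpha = np.array([0.5, 0.5])
        # Zero-initialized up-projection: the adapter starts as an identity map,
        # preserving the frozen backbone's representations at step 0.
        self.W_up = np.zeros((4 * m, d_model))

    def forward(self, h):
        z = h @ self.W_down                            # (batch, r) bottleneck
        p_lo = z @ self.Omega_low + self.b_low         # low-frequency phases
        p_hi = z @ self.Omega_high + self.b_high       # high-frequency phases
        phi_lo = np.concatenate([np.cos(p_lo), np.sin(p_lo)], axis=-1)
        phi_hi = np.concatenate([np.cos(p_hi), np.sin(p_hi)], axis=-1)
        # Frequency-aware modulation: weight each band before mixing them.
        phi = np.concatenate([self.alpha[0] * phi_lo,
                              self.alpha[1] * phi_hi], axis=-1)
        return h + phi @ self.W_up                     # residual connection
```

Because the up-projection is zero-initialized, the adapter is a no-op until training moves `W_up` and the band weights `alpha`, which is one common way such residual adapters keep the pre-trained model's behavior intact at the start of fine-tuning.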

Donggyun Bae, Jongil Park • 2025

Related benchmarks

Task                           | Dataset            | Result                 | Rank
Natural Language Understanding | GLUE               | SST-2: 96.1            | 452
Natural Language Generation    | E2E (test)         | ROUGE-L: 89.94         | 79
Dialogue                       | MT-Bench (test)    | GPT-4 Score: 8.35      | 46
Logical Reasoning              | BBH (test)         | Top@1 Accuracy: 88.29  | 27
Conversational Ability         | Vicuna Eval (test) | GPT-4 Score: 8.91      | 20
Mathematical Reasoning         | MATH (test)        | Accuracy: 84.27        | 20
Conversational Ability         | Alpaca (test)      | LC Win Rate: 71.83     | 20
