
Reversing Large Language Models for Efficient Training and Fine-Tuning

About

Large Language Models (LLMs) are notoriously expensive and time-consuming to train. Consequently, LLMs are often fine-tuned for a specific task, starting from the weights of a pre-trained LLM that serves as a foundation model. In this work, we introduce memory-efficient, reversible architectures for LLMs, inspired by symmetric and symplectic differential equations, and investigate their theoretical properties. Unlike standard baseline architectures, which store all intermediate activations, the proposed models use time-reversible dynamics to reconstruct hidden states during backpropagation, eliminating the need to store activations. This property drastically reduces memory consumption, allowing larger batch sizes within the same memory budget and thereby improving throughput. In addition, we propose an efficient method for converting existing, non-reversible LLMs into reversible architectures through fine-tuning, making our approach practical for exploiting existing pre-trained models. Our results show comparable or improved performance across several datasets, benchmarks, and LLMs, building a scalable and efficient path towards reducing the memory and computational costs of both training from scratch and fine-tuning LLMs.

Eshed Gal, Moshe Eliasof, Javier Turek, Uri Ascher, Eran Treister, Eldad Haber• 2025
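The core idea of the abstract, reconstructing hidden states during backpropagation instead of storing them, can be illustrated with a minimal additive-coupling reversible block in the style of RevNet-type architectures. This is a hedged sketch, not the paper's exact formulation: the sub-layer functions `f` and `g`, the two-stream split, and all values are illustrative stand-ins (e.g. for attention and MLP sub-layers).

```python
def f(x):
    # Illustrative stand-in for a sub-layer (e.g. attention); any function works,
    # because reversibility comes from the coupling structure, not from f itself.
    return [0.5 * v + 1.0 for v in x]

def g(x):
    # Illustrative stand-in for a second sub-layer (e.g. an MLP).
    return [0.1 * v * v for v in x]

def forward(x1, x2):
    # Additive coupling: y1 = x1 + f(x2); y2 = x2 + g(y1).
    y1 = [a + b for a, b in zip(x1, f(x2))]
    y2 = [a + b for a, b in zip(x2, g(y1))]
    return y1, y2

def inverse(y1, y2):
    # Exact reconstruction from the outputs alone:
    # x2 = y2 - g(y1); x1 = y1 - f(x2).
    # During backpropagation, this lets the backward pass recover the
    # activations instead of keeping them in memory.
    x2 = [a - b for a, b in zip(y2, g(y1))]
    x1 = [a - b for a, b in zip(y1, f(x2))]
    return x1, x2
```

Stacking such blocks means only the final output of the stack needs to be kept; each block's inputs are recomputed on the fly when gradients are needed, which is the source of the memory savings described above.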

Related benchmarks

| Task                  | Dataset              | Metric              | Result | Rank |
|-----------------------|----------------------|---------------------|--------|------|
| Commonsense Reasoning | WinoGrande           | Accuracy            | 51.85  | 776  |
| Commonsense Reasoning | PIQA                 | Accuracy            | 70.24  | 647  |
| Question Answering    | OBQA                 | Accuracy            | 24.4   | 276  |
| Question Answering    | ARC-E                | Accuracy            | 28.95  | 242  |
| Question Answering    | ARC-C                | Accuracy            | 28.09  | 166  |
| Commonsense Reasoning | ARC Challenge        | Accuracy            | 26.75  | 132  |
| Language Modeling     | OpenWebText (val)    | Validation Loss     | 2.6091 | 70   |
| Commonsense Reasoning | ARC Easy             | ARC (easy) Accuracy | 50.35  | 52   |
| Commonsense Reasoning | OpenBookQA           | Accuracy            | 35.6   | 41   |
| Language Modeling     | OpenWebText (train)  | Train Loss          | 2.5243 | 11   |
