
Late-to-Early Training: LET LLMs Learn Earlier, So Faster and Better

About

As Large Language Models (LLMs) achieve remarkable empirical success through scaling model and data size, pretraining has become increasingly critical yet computationally prohibitive, hindering rapid development. Despite the availability of numerous pretrained LLMs developed at significant computational expense, a fundamental real-world question remains underexplored: can we leverage existing small pretrained models to accelerate the training of larger models? In this paper, we propose a Late-to-Early Training (LET) paradigm that enables LLMs to explicitly learn later knowledge in earlier steps and earlier layers. The core idea is to guide the early layers of an LLM during early training using representations from the late layers of a pretrained (i.e., late-training-phase) model. We identify two key mechanisms that drive LET's effectiveness: late-to-early-step learning and late-to-early-layer learning. These mechanisms significantly accelerate training convergence while robustly enhancing both language modeling capabilities and downstream task performance, enabling faster training with superior performance. Extensive experiments on 1.4B and 7B parameter models demonstrate LET's efficiency and effectiveness. Notably, when training a 1.4B LLM on the Pile dataset, our method achieves up to a 1.6× speedup with nearly 5% improvement in downstream task accuracy compared to standard training, even when using a pretrained model with 10× fewer parameters than the target model.

Ji Zhao, Yufei Gu, Shitong Shao, Xun Zhou, Liang Xiang, Zeke Xie • 2026
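The core mechanism described above, guiding a large model's early-layer hidden states with the late-layer representations of a small frozen pretrained model, can be sketched as an auxiliary alignment loss. The linear projection, MSE distance, dimensions, and function names below are illustrative assumptions, not necessarily the paper's exact formulation:

```python
import numpy as np

# Hypothetical sketch of the LET idea: an auxiliary loss pulls the target
# model's EARLY-layer hidden states toward the LATE-layer hidden states of a
# small frozen pretrained model. Projection + MSE are our assumptions.

rng = np.random.default_rng(0)

def let_guidance_loss(student_early, teacher_late, proj):
    """MSE between student early-layer states and projected teacher states.

    student_early: (batch, seq, d_student) hidden states from an early layer
                   of the large model being trained.
    teacher_late:  (batch, seq, d_teacher) hidden states from a late layer
                   of the small pretrained model (kept frozen).
    proj:          (d_teacher, d_student) trainable projection matrix that
                   maps teacher features into the student's hidden size.
    """
    target = teacher_late @ proj
    return float(np.mean((student_early - target) ** 2))

# Illustrative shapes: a small teacher (d=512) guiding a larger student (d=2048),
# mirroring the paper's use of a teacher far smaller than the target model.
student_h = rng.standard_normal((2, 8, 2048))
teacher_h = rng.standard_normal((2, 8, 512))
proj = 0.01 * rng.standard_normal((512, 2048))

aux = let_guidance_loss(student_h, teacher_h, proj)
# In practice this would be combined with the usual objective, e.g.
# total_loss = lm_loss + lam * aux, with lam a tuning hyperparameter.
```

The guidance term would typically be annealed or dropped once early training is past, so the large model is only steered while its early layers are still forming.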

Related benchmarks

Task                         Dataset                 Accuracy   Rank
Time-series classification   SelfRegulationSCP2      53.9       55
Time-series classification   Heartbeat               75.0       51
Time-series classification   UWaveGestureLibrary     82.2       47
Time-series classification   SelfRegulationSCP1      85.5       45
Time-series classification   PEMS-SF                 62.5       45
Time-series classification   FaceDetection           66.9       34
Time-series classification   SpokenArabicDigits      99.7       28
Time-series classification   Handwriting             33.2       28
Time-series classification   JapaneseVowels          95.7       28
Time-series classification   EthanolConcentration    28.8       28
