
Neural Organ Transplantation (NOT): Checkpoint-Based Modular Adaptation for Transformer Models

About

We introduce Neural Organ Transplantation (NOT), a modular adaptation framework that enables trained transformer layers to function as reusable, transferable checkpoints for domain adaptation. Unlike conventional fine-tuning approaches that tightly couple trained parameters to specific model instances and training data, NOT extracts contiguous layer subsets ("donor organs") from pre-trained models, trains them independently on domain-specific data, and saves them as standalone checkpoint files that can be transplanted into compatible recipient models without access to the original training data. Through experiments on three decoder-only transformer architectures spanning 124M to 20B parameters (GPT-2, TinyLlama, and GPT-OSS), we demonstrate that donor transplantation substantially outperforms existing adaptation methods, achieving an order-of-magnitude improvement in perplexity over LoRA while training significantly faster. The method exhibits position dependence, with early insertion positions yielding optimal results. Cross-domain transfer at billion-parameter scale reveals unexpected regularization benefits. These findings demonstrate that transformer middle layers can support efficient modular transfer in decoder-only architectures, enabling privacy-preserving expertise sharing through checkpoint distribution. We note that this approach is currently limited to decoder-only models; preliminary experiments on encoder-based architectures show reduced effectiveness.
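The extract-and-transplant mechanics described above can be sketched in terms of checkpoint (state-dict) key manipulation. The sketch below is illustrative only and is not the authors' implementation: the `model.layers.N.*` naming convention, the function names, and the overwrite-in-place transplant strategy are all assumptions, chosen to mirror common PyTorch-style checkpoints.

```python
# Hypothetical sketch of NOT's donor-extraction and transplantation steps,
# assuming a PyTorch-style state dict (parameter name -> tensor/value) with
# layer parameters named "model.layers.<idx>.<rest>". Illustrative only.

def extract_donor(state_dict, start, end):
    """Extract the contiguous layer subset [start, end) as a standalone
    "donor organ" checkpoint. Layer indices are re-based to zero so the
    donor file is position-independent and can be shipped on its own."""
    donor = {}
    for key, value in state_dict.items():
        parts = key.split(".")
        if len(parts) > 3 and parts[:2] == ["model", "layers"]:
            idx = int(parts[2])
            if start <= idx < end:
                new_key = ".".join(["model", "layers", str(idx - start)] + parts[3:])
                donor[new_key] = value
    return donor

def transplant(recipient_state, donor, insert_at):
    """Overwrite the recipient's layers starting at `insert_at` with the
    donor's layers, by remapping the donor's zero-based indices."""
    patched = dict(recipient_state)
    for key, value in donor.items():
        parts = key.split(".")
        new_idx = int(parts[2]) + insert_at
        patched[".".join(["model", "layers", str(new_idx)] + parts[3:])] = value
    return patched
```

In a real pipeline the donor would be trained on domain data between these two steps and serialized (e.g. `torch.save`) for distribution; no original training data needs to accompany the checkpoint. The abstract's position-dependence finding corresponds to the choice of `insert_at`, with early positions reported as optimal.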

Ahmad Al-Zuraiqi • 2026

Related benchmarks

Task | Dataset | Result | Rank
Language Modeling | Wikitext (test) | Perplexity 34.56 | 52
Language Modeling | GPT-2 124M held-out (test) | Perplexity 17.33 | 10
Language Modeling | TinyLlama 1.1B | Perplexity 54.15 | 6
Language Modeling | TinyLlama 1.1B (test) | Perplexity 54.15 | 6
Language Modeling | GPT-OSS 20B held-out (test) | Perplexity 34.56 | 5
Language Modeling | GPT-2 1,000 samples | Perplexity 27.99 | 4
