# Neural Organ Transplantation (NOT): Checkpoint-Based Modular Adaptation for Transformer Models

## About
We introduce Neural Organ Transplantation (NOT), a modular adaptation framework that enables trained transformer layers to function as reusable, transferable checkpoints for domain adaptation. Unlike conventional fine-tuning approaches, which tightly couple trained parameters to specific model instances and training data, NOT extracts contiguous layer subsets ("donor organs") from pre-trained models, trains them independently on domain-specific data, and saves them as standalone checkpoint files that can be transplanted into compatible recipient models without access to the original training data. Through experiments on three decoder-only transformer architectures spanning 124M to 20B parameters (GPT-2, TinyLlama, and GPT-OSS), we demonstrate that donor transplantation substantially outperforms existing adaptation methods, achieving an order-of-magnitude improvement in perplexity over LoRA while training significantly faster. The method exhibits position dependence, with early insertion positions yielding the best results. Cross-domain transfer at billion-parameter scale reveals unexpected regularization benefits. These findings demonstrate that transformer middle layers can support efficient modular transfer in decoder-only architectures, enabling privacy-preserving expertise sharing through checkpoint distribution. We note that this approach is currently limited to decoder-only models; preliminary experiments on encoder-based architectures show reduced effectiveness.
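The extract → train → checkpoint → transplant workflow described above can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's implementation: the function names (`extract_donor`, `transplant`) are hypothetical, and toy dictionaries stand in for transformer blocks so the example stays self-contained.

```python
# Hypothetical sketch of the NOT workflow. "Layers" are toy dicts standing
# in for transformer blocks; in practice these would be trained nn.Module
# blocks whose state dicts are serialized to a checkpoint file.
import copy
import os
import pickle
import tempfile

def extract_donor(model_layers, start, end):
    """Copy a contiguous layer subset (a "donor organ") from a trained model."""
    return copy.deepcopy(model_layers[start:end])

def save_checkpoint(organ, path):
    """Serialize the organ as a standalone file; no training data is included."""
    with open(path, "wb") as f:
        pickle.dump(organ, f)

def load_checkpoint(path):
    with open(path, "rb") as f:
        return pickle.load(f)

def transplant(recipient_layers, organ, position):
    """Replace len(organ) recipient layers starting at `position`."""
    adapted = list(recipient_layers)
    adapted[position:position + len(organ)] = copy.deepcopy(organ)
    return adapted

# Toy demonstration: a 12-layer donor adapted to a "medical" domain and a
# 12-layer general-purpose recipient of the same (compatible) depth.
donor_model = [{"layer": i, "domain": "medical"} for i in range(12)]
recipient = [{"layer": i, "domain": "general"} for i in range(12)]

organ = extract_donor(donor_model, 2, 5)  # layers 2-4 inclusive
ckpt_path = os.path.join(tempfile.mkdtemp(), "organ.pkl")
save_checkpoint(organ, ckpt_path)

# The abstract reports that early insertion positions work best, so the
# organ is spliced in near the bottom of the recipient stack.
adapted = transplant(recipient, load_checkpoint(ckpt_path), position=2)
print([layer["domain"] for layer in adapted][:6])
# → ['general', 'general', 'medical', 'medical', 'medical', 'general']
```

In a real setting the checkpoint would hold layer weights (e.g. serialized state dicts), and "compatible" means the recipient shares the donor's hidden dimension and block interface so the spliced layers drop in without shape mismatches.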
## Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Language Modeling | WikiText (test) | Perplexity | 34.56 | 52 |
| Language Modeling | GPT-2 124M held-out (test) | Perplexity | 17.33 | 10 |
| Language Modeling | TinyLlama 1.1B (test) | Perplexity | 54.15 | 6 |
| Language Modeling | GPT-OSS 20B held-out (test) | Perplexity | 34.56 | 5 |
| Language Modeling | GPT-2 1,000 samples | Perplexity | 27.99 | 4 |