
SeedFlood: A Step Toward Scalable Decentralized Training of LLMs

About

This work presents SeedFlood, a new approach to decentralized training designed to scale to large models across complex network topologies and reach global consensus with minimal communication overhead. Traditional gossip-based methods suffer communication costs that grow with model size, and information decay over network hops makes global consensus inefficient. SeedFlood departs from these practices by exploiting the seed-reconstructible structure of zeroth-order updates, shrinking messages to near-zero size so they can be flooded to every client in the network. This makes communication overhead negligible and independent of model size, removing the primary scalability bottleneck in decentralized training. Consequently, SeedFlood enables training in regimes previously considered impractical, such as billion-parameter models distributed across hundreds of clients. Our experiments on decentralized LLM fine-tuning demonstrate that SeedFlood consistently outperforms gossip-based baselines in both generalization performance and communication efficiency, and even achieves results comparable to first-order methods in large-scale settings.
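The core trick described in the abstract can be sketched concretely. In zeroth-order (SPSA-style) training, an update is fully determined by the random seed that generated the perturbation plus one scalar projected-gradient estimate, so a client only needs to flood those few bytes; every peer regenerates the perturbation from the seed. The function names, the estimator details, and the flooding routine below are illustrative assumptions in the spirit of this family of methods, not the paper's actual implementation:

```python
import numpy as np

def zo_step(params, loss_fn, seed, eps=1e-3):
    """Compute one zeroth-order gradient estimate.

    The perturbation z is drawn from a seeded RNG, so the message a client
    must share is just (seed, proj_grad): a few bytes, independent of model size.
    """
    z = np.random.default_rng(seed).standard_normal(params.shape)
    proj_grad = (loss_fn(params + eps * z) - loss_fn(params - eps * z)) / (2 * eps)
    return seed, proj_grad

def apply_update(params, seed, proj_grad, lr=1e-4):
    """Any peer reconstructs the same perturbation from the seed and applies the step."""
    z = np.random.default_rng(seed).standard_normal(params.shape)
    return params - lr * proj_grad * z

def flood(graph, src):
    """Propagate a message from src to every reachable client, forwarding once per node.

    graph maps each client id to its neighbor list; returns the set of clients reached.
    """
    delivered, frontier = set(), [src]
    while frontier:
        node = frontier.pop()
        if node not in delivered:
            delivered.add(node)
            frontier.extend(graph[node])
    return delivered
```

Because the flooded payload is constant-size, the overall communication cost scales with the number of edges in the topology rather than with the parameter count, which is the property the abstract highlights.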

Jihun Kim, Namhoon Lee • 2026

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Decentralized Training | SST-2, RTE, and BoolQ (OPT-125m backbone) | Normalized Performance: 100.2 | 40 |
| Natural Language Understanding | GLUE and SuperGLUE (test val) | SST-2: 92.89 | 37 |
