
Efficient Decoder Scaling Strategy for Neural Routing Solvers

About

Construction-based neural routing solvers, typically composed of an encoder and a decoder, have emerged as a promising approach for solving vehicle routing problems. While recent studies suggest that shifting parameters from the encoder to the decoder enhances performance, most works restrict the decoder size to 1-3M parameters, leaving the effects of scaling largely unexplored. To address this gap, we conduct a systematic study comparing two distinct strategies: scaling depth versus scaling width. We synthesize these strategies to construct a suite of 12 model configurations, spanning a parameter range from 1M to ~150M, and extensively evaluate their scaling behaviors across three critical dimensions: parameter efficiency, data efficiency, and compute efficiency. Our empirical results reveal that parameter count alone is insufficient to accurately predict model performance, highlighting the critical and distinct roles of model depth (layer count) and width (embedding dimension). Crucially, we demonstrate that scaling depth yields superior performance gains over scaling width. Based on these findings, we provide and experimentally validate a set of design principles for the efficient allocation of parameters and compute resources to enhance model performance.
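The depth-versus-width trade-off the abstract describes can be made concrete with a back-of-the-envelope parameter count. The sketch below assumes a standard Transformer-style decoder layer with roughly 12·d² parameters (≈4·d² for attention, ≈8·d² for the feed-forward block, ignoring biases and normalization); this is a common approximation, not the paper's exact architecture, and the specific configurations shown are illustrative.

```python
def decoder_params(depth: int, width: int) -> int:
    """Approximate parameter count of `depth` Transformer decoder layers
    with embedding dimension `width` (attention ~4*w^2, FFN ~8*w^2)."""
    per_layer = 4 * width**2 + 8 * width**2  # = 12 * width^2
    return depth * per_layer

# Scaling from a ~1M-parameter base (hypothetical configurations):
base = decoder_params(6, 128)    # 6 layers,  d=128 -> ~1.2M
deep = decoder_params(48, 128)   # 8x depth          -> ~9.4M
wide = decoder_params(6, 512)    # 4x width          -> ~18.9M

for name, n in [("base", base), ("deep", deep), ("wide", wide)]:
    print(f"{name}: {n / 1e6:.1f}M parameters")
```

Because parameters grow linearly in depth but quadratically in width, a 48-layer, d=512 configuration lands near the top of the paper's 1M-150M range (~151M) under this approximation, which is why depth and width reach the same budget along very different paths.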

Qing Luo, Fu Luo, Ke Li, Zhenkun Wang• 2026

Related benchmarks

| Task | Dataset | Optimality Gap | Rank |
|---|---|---|---|
| Traveling Salesman Problem | Uniform-TSP1000 | 0.493 | 18 |
| Traveling Salesman Problem | Uniform-TSP100 | 1.3 | 14 |
| Traveling Salesman Problem | TSPLIB | 1.288 | 14 |
| Traveling Salesperson Problem | TSPLib < 500 nodes | 0.811 | 7 |
