
TextResNet: Decoupling and Routing Optimization Signals in Compound AI Systems via Deep Residual Tuning

About

Textual gradient-style optimizers such as TextGrad enable gradient-like feedback propagation through compound AI systems, but their effectiveness degrades on deep chains. This limitation stems from Semantic Entanglement in extended workflows: in standard textual backpropagation, feedback signals mix local critiques with upstream contexts, causing Attribution Ambiguity. To address this challenge, we propose TextResNet, a framework that reformulates the optimization process to achieve precise signal routing via four key innovations. First, in the forward pass, it enforces Additive Semantic Deltas to preserve an Identity Highway for gradient flow. Second, in the backward pass, it introduces Semantic Gradient Decomposition via a Semantic Projector, which disentangles feedback into causally independent subspaces. Third, it performs Causal Routing, delivering each projected signal to the specific component it concerns. Finally, it applies Density-Aware Optimization Scheduling, using the disentangled signals to dynamically allocate optimization effort to key system bottlenecks. Our results show that TextResNet not only outperforms TextGrad but also remains remarkably stable on agentic tasks in compound AI systems where baselines collapse. Code is available at https://github.com/JeanDiable/TextResNet.
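The four mechanisms named in the abstract can be illustrated with a toy Python sketch. Everything below is an illustrative assumption, not the authors' implementation: a real system would use LLM calls where this sketch uses string concatenation (additive deltas) and pre-tagged feedback (the semantic projector), but the control flow — identity-preserving forward pass, decomposition, per-component routing, density-aware scheduling — mirrors the description above.

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    """One stage in a compound pipeline (hypothetical names).

    Its forward pass appends an additive semantic delta rather than
    rewriting the running context, preserving an identity path from
    input to output (the 'Identity Highway')."""
    name: str
    delta: str
    feedback: list = field(default_factory=list)

    def forward(self, context: str) -> str:
        # Additive Semantic Delta: incoming context passes through
        # unchanged; this component only contributes its own delta.
        return context + " " + self.delta

def decompose(raw_feedback: dict) -> dict:
    """Toy stand-in for the Semantic Projector: here the critiques
    arrive already tagged with the component each one targets, so
    'projection' is a lookup. A real projector would disentangle a
    mixed natural-language critique into per-component subspaces."""
    return raw_feedback

def backward(components: list, raw_feedback: dict) -> None:
    # Causal Routing: each disentangled signal is delivered only to
    # the component it concerns, never smeared over upstream context.
    routed = decompose(raw_feedback)
    for comp in components:
        comp.feedback.extend(routed.get(comp.name, []))

def schedule(components: list) -> Component:
    """Density-Aware Optimization Scheduling (toy version): spend the
    next optimization step on the component that attracted the most
    feedback, i.e. the current bottleneck."""
    return max(components, key=lambda c: len(c.feedback))

# Usage: a three-stage chain with hypothetical component names.
chain = [
    Component("retriever", "[facts]"),
    Component("reasoner", "[chain-of-thought]"),
    Component("writer", "[answer]"),
]
out = "query"
for comp in chain:
    out = comp.forward(out)          # identity-preserving forward pass

backward(chain, {
    "reasoner": ["step 2 skips a premise", "cites the wrong fact"],
    "writer": ["answer omits units"],
})
bottleneck = schedule(chain)         # -> the "reasoner" component
```

The design point the sketch makes concrete: because each stage only appends a delta, a critique of the final output can be attributed to the stage that produced the offending span, instead of being entangled with everything upstream.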

Suizhi Huang, Mei Li, Han Yu, Xiaoxiao Li • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Multi-hop Question Answering | HotpotQA (test) | F1 | 46.23 | 198 |
| Medical Question Answering | PubMedQA Reasoning Required | Accuracy | 60.31 | 10 |
| Code Generation | BigCodeBench instruction split (test) | Pass Rate | 37.86 | 6 |
| Semi-structured Retrieval and Query Optimization | STARK-PRIME (test) | MRR | 41.75 | 6 |
