TokenMixer-Large: Scaling Up Large Ranking Models in Industrial Recommenders
About
While scaling laws for recommendation models have gained significant traction, existing architectures such as Wukong, HiFormer, and DHEN often struggle with sub-optimal designs and hardware under-utilization, limiting their practical scalability. Our previous TokenMixer architecture (introduced in the RankMixer paper) addressed effectiveness and efficiency by replacing self-attention with a lightweight token-mixing operator; however, it faced critical bottlenecks in deeper configurations, including sub-optimal residual paths, vanishing gradients, incomplete MoE sparsification, and constrained scalability. In this paper, we propose TokenMixer-Large, a systematically evolved architecture designed for extreme-scale recommendation. By introducing a mixing-and-reverting operation, inter-layer residuals, and an auxiliary loss, we ensure stable gradient propagation even as model depth increases. Furthermore, we incorporate a Sparse Per-token MoE to enable efficient parameter expansion. TokenMixer-Large scales to 7 billion parameters on online traffic and 15 billion parameters in offline experiments. Currently deployed in multiple scenarios at ByteDance, TokenMixer-Large has achieved significant offline and online performance gains: +1.66% in orders and +2.98% in per-capita preview payment GMV for e-commerce, +2.0% in ADSS for advertising, and +1.4% revenue growth for live streaming.
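The mixing-and-reverting operation and inter-layer residuals described above can be illustrated with a minimal NumPy sketch. This is an assumption-laden toy, not the paper's implementation: it assumes the number of mixing heads equals the number of tokens (so each token exchanges one chunk with every other token), and it uses a dense per-token feed-forward network as a stand-in for the Sparse Per-token MoE; all function and variable names are illustrative.

```python
import numpy as np

def token_mix(x):
    """All-to-all token mixing (sketch): each token's vector is split
    into T equal chunks, and new token t gathers chunk t from every
    token. This permutation is an involution, so applying it a second
    time reverts the mix."""
    T, D = x.shape
    assert D % T == 0, "width must be divisible by the token count in this toy"
    return x.reshape(T, T, D // T).transpose(1, 0, 2).reshape(T, D)

def block(x, W1, W2):
    """One mixing-and-reverting block with an inter-layer residual:
    mix -> per-token feed-forward -> revert -> residual add.
    W1, W2 hold separate weights per token (dense stand-in for a
    sparse per-token MoE)."""
    mixed = token_mix(x)                    # exchange information across tokens
    h = np.einsum('td,tdh->th', mixed, W1)  # per-token up-projection
    h = np.maximum(h, 0.0)                  # ReLU
    h = np.einsum('th,thd->td', h, W2)      # per-token down-projection
    return x + token_mix(h)                 # revert the mix, then residual

rng = np.random.default_rng(0)
T, D, H = 4, 8, 16                          # 4 tokens, width 8, hidden 16
x = rng.standard_normal((T, D))
W1 = rng.standard_normal((T, D, H)) * 0.1
W2 = rng.standard_normal((T, H, D)) * 0.1
y = block(x, W1, W2)
print(y.shape)                              # (4, 8)
```

Because the mixing permutation is its own inverse here, the "revert" step is simply a second application of `token_mix`, which keeps each token's residual path aligned with its own features rather than a shuffled mixture.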
Related benchmarks
| Task | Dataset | Result | Rank |
|---|---|---|---|
| CTCVR Prediction | E-commerce Douyin | ΔAUC 1.2 | 12 |
| Ad Ranking | Douyin Feed Ads (online) | ΔAUC 0.0035 | 1 |
| E-commerce Ranking | Douyin E-Commerce (online) | ΔAUC 0.51 | 1 |
| Live Streaming Ranking | Douyin Live Streaming (online) | ΔAUC 0.7 | 1 |