
Softly Symbolifying Kolmogorov-Arnold Networks

About

Kolmogorov-Arnold Networks (KANs) offer a promising path toward interpretable machine learning: their learnable activations can be studied individually, while collectively fitting complex data accurately. In practice, however, trained activations often lack symbolic fidelity, learning pathological decompositions with no meaningful correspondence to interpretable forms. We propose Softly Symbolified Kolmogorov-Arnold Networks (S2KAN), which integrate symbolic primitives directly into training. Each activation draws from a dictionary of symbolic and dense terms, with learnable gates that sparsify the representation. Crucially, this sparsification is differentiable, enabling end-to-end optimization, and is guided by a principled Minimum Description Length objective. When symbolic terms suffice, S2KAN discovers interpretable forms; when they do not, it gracefully degrades to dense splines. We demonstrate competitive or superior accuracy with substantially smaller models across symbolic benchmarks, dynamical systems forecasting, and real-world prediction tasks, and observe evidence of emergent self-sparsification even without regularization pressure.
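The abstract describes each activation as a gated mixture over a dictionary of symbolic and dense terms, with differentiable gates driving sparsification. A minimal sketch of that idea is below; the primitive dictionary, gate parameterization, and function names are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

# Hypothetical dictionary of symbolic primitives (illustrative choice;
# the paper's actual dictionary may differ).
PRIMITIVES = [
    ("identity", lambda x: x),
    ("square",   lambda x: x ** 2),
    ("sin",      np.sin),
    ("exp",      np.exp),
]

def soft_gate(logit, temperature=1.0):
    """Differentiable gate in (0, 1); a sparsity penalty on open gates
    (e.g., an MDL-style description-length term) pushes logits negative."""
    return 1.0 / (1.0 + np.exp(-logit / temperature))

def edge_activation(x, logits, weights, dense_term=None):
    """One gated activation: a weighted sum of symbolic terms, each
    scaled by its gate, plus an optional dense fallback term."""
    out = np.zeros_like(x, dtype=float)
    for (_, f), logit, w in zip(PRIMITIVES, logits, weights):
        out += soft_gate(logit) * w * f(x)
    if dense_term is not None:  # e.g., a spline, when symbols don't suffice
        out += dense_term(x)
    return out

x = np.linspace(-1.0, 1.0, 5)
# Gates nearly closed except for "square": activation collapses to ~2*x^2,
# i.e., a symbolic form is recovered.
logits = np.array([-10.0, 10.0, -10.0, -10.0])
weights = np.array([1.0, 2.0, 1.0, 1.0])
y = edge_activation(x, logits, weights)
```

When no single primitive fits, the gates can stay partially open and the dense term absorbs the residual, which matches the paper's "graceful degradation" behavior.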

James Bagrow, Josh Bongard • 2025

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|------|---------|--------|--------|------|
| Symbolic Regression | Nguyen-F7 | R² | 1 | 18 |
| Symbolic Regression | Nguyen-F1 | R² | 1 | 18 |
| Symbolic Regression | Nguyen-F2 | R² | 1 | 18 |
| Symbolic Regression | Nguyen-F3 | R² | 1 | 18 |
| Symbolic Regression | Nguyen-F4 | R² | 1 | 18 |
| Symbolic Regression | Nguyen-F6 | R² | 1 | 18 |
| Symbolic Regression | Nguyen-F10 | R² | 0.9999 | 18 |
| Symbolic Regression | Nguyen-F8 | R² | 0.9998 | 18 |
| Symbolic Regression | Nguyen-F9 | R² | 0.9999 | 18 |
| Symbolic Regression | Nguyen-F5 | R² | 0.9999 | 18 |

Showing 10 of 14 rows.
