
Initialization Schemes for Kolmogorov-Arnold Networks: An Empirical Study

About

Kolmogorov-Arnold Networks (KANs) are a recently introduced neural architecture that replaces fixed nonlinearities with trainable activation functions, offering enhanced flexibility and interpretability. While KANs have been applied successfully across scientific and machine learning tasks, their initialization strategies remain largely unexplored. In this work, we study initialization schemes for spline-based KANs, proposing two theory-driven approaches inspired by LeCun and Glorot, as well as an empirical power-law family with tunable exponents. Our evaluation combines large-scale grid searches on function fitting and forward PDE benchmarks, an analysis of training dynamics through the lens of the Neural Tangent Kernel, and evaluations on a subset of the Feynman dataset. Our findings indicate that the Glorot-inspired initialization significantly outperforms the baseline in parameter-rich models, while power-law initialization achieves the strongest performance overall, both across tasks and for architectures of varying size. All code and data accompanying this manuscript are publicly available at https://github.com/srigas/KAN_Initialization_Schemes.
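To make the two families of schemes concrete, here is a minimal sketch of what a Glorot-inspired and a power-law initialization of spline coefficients might look like. The function names, the coefficient tensor layout `(fan_out, fan_in, grid_size)`, and the exact scaling formulas are illustrative assumptions, not the paper's actual implementation (see the linked repository for that).

```python
import numpy as np

def glorot_spline_init(fan_in, fan_out, grid_size, rng=None):
    """Glorot-style initialization sketch: sample spline coefficients
    with variance 2 / (fan_in + fan_out), by analogy with Glorot
    initialization for linear layers. Layout and scaling are
    illustrative assumptions."""
    rng = np.random.default_rng(rng)
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_out, fan_in, grid_size))

def power_law_init(fan_in, fan_out, grid_size, alpha=0.5, rng=None):
    """Power-law family sketch: the coefficient scale decays as
    fan_in ** (-alpha), with the exponent alpha as the tunable
    hyperparameter mentioned in the abstract."""
    rng = np.random.default_rng(rng)
    std = fan_in ** (-alpha)
    return rng.normal(0.0, std, size=(fan_out, fan_in, grid_size))
```

In both cases the tunable quantity is only the standard deviation of the coefficient distribution; the power-law family exposes the exponent `alpha` directly, which is what the grid searches in the paper sweep over.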

Spyros Rigas, Dhruv Verma, Georgios Alexandridis, Yixuan Wang • 2025

Related benchmarks

Task               | Dataset           | Result                      | Rank
-------------------|-------------------|-----------------------------|-------
Function Fitting   | Feynman           | Loss                        | 1 / 60
Symbolic Regression| Feynman benchmark | Median Final Training Loss  | 1 / 60
