A Graph Meta-Network for Learning on Kolmogorov-Arnold Networks
About
Weight-space models learn directly from the parameters of neural networks, enabling tasks such as predicting their accuracy on new datasets. Naive methods, such as applying MLPs to flattened parameters, perform poorly, making the design of better weight-space architectures a central challenge. While prior work leveraged permutation symmetries in standard networks to guide such designs, no analogous analysis or tailored architecture yet exists for Kolmogorov-Arnold Networks (KANs). In this work, we show that KANs share the same permutation symmetries as MLPs, and propose the KAN-graph, a graph representation of their computation. Building on this, we develop WS-KAN, the first weight-space architecture that learns on KANs, which naturally accounts for their symmetry. We analyze WS-KAN's expressive power, showing it can replicate an input KAN's forward pass, a standard approach for assessing expressiveness in weight-space architectures. We construct a comprehensive "zoo" of trained KANs spanning diverse tasks, which we use as benchmarks to empirically evaluate WS-KAN. Across all tasks, WS-KAN consistently outperforms structure-agnostic baselines, often by a substantial margin. Our code is available at https://github.com/BarSGuy/KAN-Graph-Metanetwork.
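The permutation symmetry referred to above can be illustrated with a toy example. The sketch below is not the paper's implementation; it uses a simplified two-layer KAN whose edge functions are hypothetical univariate maps `phi(x) = a*x + b*sin(x)`, chosen only to make the point runnable: permuting the hidden units, together with the matching rows of the first layer's edge-function parameters and the entries of the second layer's, leaves the network's output unchanged.

```python
import numpy as np

# Toy 2-layer KAN: hidden_k = sum_i phi1[k,i](x_i); out = sum_k phi2[k](hidden_k).
# Each edge function phi(x) = a*x + b*sin(x) is parameterized by a pair (a, b);
# this simple form is an illustrative stand-in for learned splines.
rng = np.random.default_rng(0)
d_in, d_hidden = 3, 4
A1 = rng.normal(size=(d_hidden, d_in))  # first-layer 'a' coefficients
B1 = rng.normal(size=(d_hidden, d_in))  # first-layer 'b' coefficients
a2 = rng.normal(size=d_hidden)          # second-layer 'a' coefficients
b2 = rng.normal(size=d_hidden)          # second-layer 'b' coefficients

def kan_forward(x, A1, B1, a2, b2):
    # x broadcasts over the rows of A1/B1; each row sums to one hidden unit.
    h = (A1 * x + B1 * np.sin(x)).sum(axis=1)
    return (a2 * h + b2 * np.sin(h)).sum()

x = rng.normal(size=d_in)
perm = rng.permutation(d_hidden)

y = kan_forward(x, A1, B1, a2, b2)
# Permute hidden units and the corresponding edge-function parameters jointly.
y_perm = kan_forward(x, A1[perm], B1[perm], a2[perm], b2[perm])
assert np.isclose(y, y_perm)  # same function, different parameter ordering
```

This is exactly the symmetry that flattened-parameter baselines ignore: the two parameter vectors above differ elementwise yet encode the same function, which is what motivates a symmetry-aware graph representation.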
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| INR classification | Fashion-MNIST Implicit Neural Representations (test) | Accuracy | 84.6 | 15 |
| INR classification | MNIST (test) | Accuracy | 94.3 | 7 |
| INR classification | CIFAR-10 (test) | Accuracy | 42.2 | 7 |
| Accuracy prediction | Fashion-MNIST (test) | MSE | 2.94 | 6 |
| Accuracy prediction | Kuzushiji-MNIST (test) | MSE | 1.45 | 6 |
| Pruning mask prediction | MNIST (test) | Accuracy | 97.93 | 6 |
| Pruning mask prediction | Fashion-MNIST (test) | Accuracy | 98.93 | 6 |
| Pruning mask prediction | Kuzushiji-MNIST (test) | Accuracy | 97.72 | 6 |
| Regression | MNIST (test) | MSE | 3.29 | 6 |