
Radial Müntz–Szász Networks: Neural Architectures with Learnable Power Bases for Multidimensional Singularities

About

Radial singular fields, such as $1/r$, $\log r$, and crack-tip profiles, are difficult for coordinate-separable neural architectures to model. We show that any $C^2$ function that is both radial and additively separable must be quadratic, establishing a fundamental obstruction for coordinate-wise power-law models. Motivated by this result, we introduce Radial Müntz–Szász Networks (RMN), which represent fields as linear combinations of learnable radial powers $r^\mu$, including negative exponents, together with a limit-stable log-primitive for exact $\log r$ behavior. RMN admits closed-form spatial gradients and Laplacians, enabling physics-informed learning on punctured domains. Across ten 2D and 3D benchmarks, RMN achieves 1.5$\times$--51$\times$ lower RMSE than MLPs and 10$\times$--100$\times$ lower RMSE than SIREN while using 27 parameters, compared with 33,537 for MLPs and 8,577 for SIREN. We extend RMN to angular dependence (RMN-Angular) and to multiple sources with learnable centers (RMN-MC); when optimization converges, source-center recovery errors fall below $10^{-4}$. We also report controlled failures on smooth, strongly non-radial targets to delineate RMN's operating regime.
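The abstract describes an RMN field as a linear combination of learnable radial powers $r^\mu$ plus a log term, with closed-form evaluation. A minimal sketch of that idea follows; the function and parameter names are illustrative assumptions, not the paper's actual implementation, and the exponents, coefficients, and center (learnable in RMN/RMN-MC) are plain arrays here.

```python
import numpy as np

def rmn_forward(x, center, mus, coeffs, log_coeff=0.0):
    """Illustrative single-center RMN-style field evaluation (names hypothetical).

    f(x) = sum_k coeffs[k] * r**mus[k] + log_coeff * log(r),
    where r = ||x - center||. Negative exponents model 1/r-type
    singularities; log_coeff models exact log r behavior.
    """
    r = np.linalg.norm(x - center, axis=1)          # (N,) radial distances
    powers = r[:, None] ** mus[None, :]             # (N, K) learnable power basis
    return powers @ coeffs + log_coeff * np.log(r)  # (N,) field values

# Example: a single exponent mu = -1 with unit coefficient gives the 1/r Coulomb field.
x = np.array([[2.0, 0.0], [0.0, 4.0]])
f = rmn_forward(x, np.zeros(2), np.array([-1.0]), np.array([1.0]))
# f == [0.5, 0.25]
```

Because the representation is an explicit power series in $r$, spatial gradients and Laplacians follow in closed form (e.g. $\nabla^2 r^\mu = \mu(\mu + d - 2)\,r^{\mu-2}$ in $d$ dimensions), which is what enables the physics-informed training on punctured domains mentioned above.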

Gnankan Landry Regis N'guessan, Bum Jun Kim • 2026

Related benchmarks

Task                          Dataset            Result                   Rank
Function Approximation        2D smooth          RMSE 4.781               5
Solving 3D Poisson equation   3D Poisson (val)   Best Rel L2 Error 8.8    4
Function Approximation        2D log r           --                       4
Function Approximation        2D r^1/2           --                       4
Function Approximation        2D r^-1            --                       4
Function Approximation        2D multi-power     --                       4
Function Approximation        3D Coulomb         --                       4
Function Approximation        2D crack-tip       --                       2
Function Approximation        2D 2-source        --                       2
Function Approximation        2D 3-source        --                       2
