
Efficient Approximation to Analytic and $L^p$ functions by Height-Augmented ReLU Networks

About

This work addresses two fundamental limitations in neural network approximation theory. We demonstrate that a three-dimensional network architecture enables a significantly more efficient representation of sawtooth functions, which serve as the cornerstone in the approximation of analytic and $L^p$ functions. First, we establish substantially improved exponential approximation rates for several important classes of analytic functions and offer a parameter-efficient network design. Second, for the first time, we derive a quantitative, non-asymptotic, high-order approximation for general $L^p$ functions. Our techniques advance the theoretical understanding of neural network approximation in fundamental function spaces and offer a theoretically grounded pathway for designing more parameter-efficient networks.
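Since the abstract singles out sawtooth functions as the cornerstone of these approximations, the following is a minimal sketch of the classical ReLU sawtooth construction (the depth-based trick of Telgarsky and Yarotsky) that such results build on. This is only illustrative background, not the paper's height-augmented three-dimensional architecture; the function names are our own.

```python
def relu(x):
    return max(x, 0.0)

def hat(x):
    # One hidden layer of three ReLU units realizes the hat function on [0, 1]:
    # h(x) = 2x on [0, 1/2] and h(x) = 2 - 2x on [1/2, 1].
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def sawtooth(x, k):
    # Composing the hat function k times yields a sawtooth with 2^(k-1) teeth
    # on [0, 1], realized by a ReLU network of depth O(k) -- exponentially
    # more oscillations per parameter than any shallow network can produce.
    for _ in range(k):
        x = hat(x)
    return x
```

The exponential efficiency of this composition (2^(k-1) oscillations from O(k) parameters) is exactly the kind of representation the paper reports improving with its height-augmented design.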

ZeYu Li, FengLei Fan, TieYong Zeng • 2026

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Function Approximation | Analytic functions on [0, 1 − δ]^d | Approximation Error: 1 | 2 |
| Function Approximation | Analytic functions on L²(ℝ^d, γ_d), holomorphic in a strip | Error Bound: 1 | 2 |
| Function Approximation | Polynomial functions on [0, 1] | Error: 2 | 1 |
| Function Approximation | Lᵖ functions on [−1, 1]^d | Error: 1 | 1 |
| Function Approximation | Analytic functions on [0, 1]^d, holomorphic in an ellipse | -- | 1 |
