Deep Legendre Transform
About
We introduce a novel deep learning algorithm for computing convex conjugates of differentiable convex functions, a fundamental operation in convex analysis with various applications in different fields such as optimization, control theory, physics and economics. While traditional numerical methods suffer from the curse of dimensionality and become computationally intractable in high dimensions, more recent neural-network-based approaches scale better, but have mostly been studied with the aim of solving optimal transport problems and require the solution of complicated optimization or max-min problems. Using an implicit Fenchel formulation of convex conjugation, our approach facilitates an efficient gradient-based framework for the minimization of approximation errors and, as a byproduct, also provides a posteriori estimates of the approximation accuracy. Numerical experiments demonstrate our method's ability to deliver accurate results across different high-dimensional examples. Moreover, by employing symbolic regression with Kolmogorov-Arnold networks, it is able to obtain the exact convex conjugates of specific convex functions.
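For context, the operation being approximated is the convex conjugate (Legendre-Fenchel transform) f*(y) = sup_x (⟨x, y⟩ − f(x)). The following is a minimal classical sketch, not the paper's neural method: for a differentiable convex f, the supremum is attained where ∇f(x) = y, which a simple gradient ascent can locate in one dimension. The function names and step sizes here are illustrative choices.

```python
import math

def conjugate_1d(f, df, y, x=0.0, lr=0.1, steps=2000):
    # f*(y) = sup_x (x*y - f(x)); ascend the concave objective
    # g(x) = x*y - f(x), whose stationarity condition is df(x) = y.
    for _ in range(steps):
        x += lr * (y - df(x))
    return x * y - f(x)

# Example: f(x) = exp(x) has the known conjugate
# f*(y) = y*log(y) - y for y > 0.
val = conjugate_1d(math.exp, math.exp, 2.0)
# exact value: 2*log(2) - 2
```

Per-query optimization like this is what becomes expensive on grids in high dimensions; the method described above instead trains a network to represent f* directly.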
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Solving Hamilton-Jacobi equations | Hamilton-Jacobi equation with quadratic initial condition and Hamiltonian | L2 Error | 0.0027 | 40 |
| PDE solving | Hamilton-Jacobi equation with quadratic initial condition, exponential Hamiltonian | L2 Error | 0.0688 | 16 |
| PDE approximation | Hamilton-Jacobi equations d=10 v1 (test) | L2 Error | 0.0272 | 8 |
| PDE approximation | Hamilton-Jacobi equations d=30 v1 (test) | L2 Error | 0.128 | 8 |
| Function Approximation | Negative Logarithm (Neg. Log) (test) | Solve Time (s) | 16.97 | 7 |
| Function Approximation | Negative Entropy (test) | Solve Time (s) | 17.13 | 7 |