
Deep Legendre Transform

About

We introduce a novel deep learning algorithm for computing convex conjugates of differentiable convex functions, a fundamental operation in convex analysis with various applications in different fields such as optimization, control theory, physics and economics. While traditional numerical methods suffer from the curse of dimensionality and become computationally intractable in high dimensions, more recent neural network-based approaches scale better, but have mostly been studied with the aim of solving optimal transport problems and require the solution of complicated optimization or max-min problems. Using an implicit Fenchel formulation of convex conjugation, our approach facilitates an efficient gradient-based framework for the minimization of approximation errors and, as a byproduct, also provides a posteriori estimates of the approximation accuracy. Numerical experiments demonstrate our method's ability to deliver accurate results across different high-dimensional examples. Moreover, by employing symbolic regression with Kolmogorov-Arnold networks, it is able to obtain the exact convex conjugates of specific convex functions.
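To make the underlying operation concrete: for a differentiable, strictly convex f, the supremum in the convex conjugate f*(y) = sup_x [⟨x, y⟩ − f(x)] is attained where ∇f(x) = y, so each sample point x yields an exact pair (y, f*(y)) without solving an optimization problem. The sketch below illustrates this classical identity only; it is not the paper's neural network algorithm, and the function names are our own.

```python
import numpy as np

def conjugate_pairs(f, grad_f, xs):
    """For differentiable strictly convex f, the supremum in
    f*(y) = sup_x <x, y> - f(x) is attained where grad f(x) = y,
    so each sample x gives the exact pair
    y = grad f(x),  f*(y) = <x, grad f(x)> - f(x)."""
    ys = np.array([grad_f(x) for x in xs])
    fstar = np.array([x @ grad_f(x) - f(x) for x in xs])
    return ys, fstar

# Example: negative entropy f(x) = sum_i x_i log x_i (x > 0),
# whose conjugate is known in closed form: f*(y) = sum_i exp(y_i - 1).
f = lambda x: np.sum(x * np.log(x))
grad_f = lambda x: np.log(x) + 1.0

xs = np.random.default_rng(0).uniform(0.5, 2.0, size=(5, 3))
ys, fstar = conjugate_pairs(f, grad_f, xs)

# Compare against the closed-form conjugate.
err = np.max(np.abs(fstar - np.exp(ys - 1.0).sum(axis=1)))
print(err)  # numerically zero
```

Neural approaches like the one described above learn f* from such gradient-based information rather than evaluating it pointwise, which is what allows them to scale to high dimensions.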

Aleksey Minabutdinov, Patrick Cheridito • 2025

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Solving Hamilton-Jacobi equations | Hamilton-Jacobi equation with quadratic initial condition and Hamiltonian | L2 Error | 0.0027 | 40 |
| PDE solving | Hamilton-Jacobi equation with quadratic initial condition, exponential Hamiltonian | L2 Error | 0.0688 | 16 |
| PDE approximation | Hamilton-Jacobi equations d=10 v1 (test) | L2 Error | 0.0272 | 8 |
| PDE approximation | Hamilton-Jacobi equations d=30 v1 (test) | L2 Error | 0.128 | 8 |
| Function Approximation | Negative Logarithm (Neg. Log) (test) | Solve Time (s) | 16.97 | 7 |
| Function Approximation | Negative Entropy (test) | Solve Time (s) | 17.13 | 7 |
