
Accelerating optimization over the space of probability measures

About

The acceleration of gradient-based optimization methods is a subject of significant practical and theoretical importance, particularly within machine learning applications. While much attention has been directed towards optimizing within Euclidean space, the need to optimize over spaces of probability measures in machine learning motivates exploration of accelerated gradient methods in this context too. To this end, we introduce a Hamiltonian-flow approach analogous to momentum-based approaches in Euclidean space. We demonstrate that, in the continuous-time setting, algorithms based on this approach can achieve convergence rates of arbitrarily high order. We complement our findings with numerical examples.
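As a toy illustration of the momentum idea on the particle level, the sketch below evolves an ensemble of particles under damped Hamiltonian dynamics (position–momentum pairs) to minimize a potential energy over the empirical measure. This is a hypothetical, simplified analogue for intuition only, not the authors' scheme; the damping coefficient `gamma`, the quadratic potential, and the semi-implicit Euler discretization are all assumptions.

```python
import numpy as np

def accelerated_particle_flow(grad_V, x0, gamma=1.0, dt=0.01, steps=2000):
    """Damped Hamiltonian (momentum) flow on a particle ensemble.

    Each particle carries a position x and a momentum p, evolved by
        x' = p,    p' = -grad_V(x) - gamma * p
    via semi-implicit Euler. The empirical measure of the particles stands
    in for the evolving probability measure. This is an illustrative
    sketch, not the paper's method.
    """
    x = x0.copy()
    p = np.zeros_like(x)
    for _ in range(steps):
        p += dt * (-grad_V(x) - gamma * p)  # momentum update with friction
        x += dt * p                          # position update
    return x

# Minimize the potential energy E_mu[V] with V(x) = 0.5 * ||x||^2,
# whose minimizing measure concentrates all mass at the origin.
rng = np.random.default_rng(0)
x0 = rng.normal(size=(200, 2))
xT = accelerated_particle_flow(lambda x: x, x0)
print(np.abs(xT).max())  # particles collapse toward the minimizer at 0
```

In continuous time this corresponds to an underdamped flow whose energy decays exponentially, which is the Euclidean intuition the abstract carries over to the space of probability measures.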

Shi Chen, Qin Li, Oliver Tse, Stephen J. Wright • 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Non-rigid Surface Registration | Liver | ASSD (mm) | 0.749 | 6 |
| Non-rigid Surface Registration | Pancreas | ASSD (mm) | 0.416 | 6 |
| Non-rigid Surface Registration | Left ventricle | ASSD (mm) | 0.407 | 6 |
| Affine Surface Registration | Liver | ASSD (mm) | 7.555 | 5 |
| Affine Surface Registration | Pancreas | ASSD (mm) | 5.511 | 5 |
| Affine Surface Registration | Left ventricle | ASSD (mm) | 2.172 | 5 |
| Affine Surface Registration | Liver | Runtime (s) | 2.53 | 4 |
| Affine Surface Registration | Pancreas | Runtime (s) | 2.82 | 4 |
| Affine Surface Registration | Left ventricle | Runtime (s) | 1.6 | 4 |
| Non-rigid Surface Registration | Liver | Runtime (s) | 2.53 | 4 |
Showing 10 of 12 rows
