
On the rate of convergence in Wasserstein distance of the empirical measure

About

Let $\mu_N$ be the empirical measure associated with an $N$-sample of a given probability distribution $\mu$ on $\mathbb{R}^d$. We are interested in the rate of convergence of $\mu_N$ to $\mu$, measured in the Wasserstein distance of order $p>0$. We provide satisfying non-asymptotic $L^p$-bounds and concentration inequalities for all values of $p>0$ and $d\geq 1$. We also extend the non-asymptotic $L^p$-bounds to stationary $\rho$-mixing sequences, Markov chains, and some interacting particle systems.
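The convergence the abstract describes can be observed numerically. Below is a minimal Python sketch, not taken from the paper: in dimension $d=1$ the Wasserstein distance $W_1(\mu_N,\mu)$ has the closed form $\int_0^1 |F_N^{-1}(u)-F^{-1}(u)|\,du$, which we approximate by comparing the sorted sample to the true quantiles $(i-1/2)/N$ of $\mu = \mathrm{Uniform}(0,1)$. The function name and the choice of target distribution are illustrative assumptions.

```python
import random

def w1_empirical_vs_uniform(n, rng):
    """Approximate W_1(mu_N, mu) for mu = Uniform(0,1) and an n-sample mu_N.

    Uses the 1-d quantile formula: W_1 = integral_0^1 |F_N^{-1}(u) - F^{-1}(u)| du,
    approximated by matching the i-th order statistic to the quantile (i - 0.5)/n.
    (Illustrative helper, not code from the paper.)
    """
    xs = sorted(rng.random() for _ in range(n))
    return sum(abs(x - (i + 0.5) / n) for i, x in enumerate(xs)) / n

if __name__ == "__main__":
    rng = random.Random(0)
    for n in (100, 1_000, 10_000):
        # Average a few replications to smooth the Monte Carlo noise.
        est = sum(w1_empirical_vs_uniform(n, rng) for _ in range(20)) / 20
        print(n, est)
```

Running this, the averaged distance shrinks by roughly a factor of $\sqrt{10}$ per tenfold increase in $N$, consistent with the $N^{-1/2}$ rate one expects for $p=1$, $d=1$; the interesting regime in the paper is higher dimensions, where the rate degrades.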

Nicolas Fournier, Arnaud Guillin • 2013

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Distribution-learning | Uniform distribution, rho = 1 | Wasserstein Distance Error: 0.023 | 48 |
| Distribution-learning | Gaussian distribution, rho = 2 (test) | Wasserstein Distance Error: 0.166 | 48 |
| Distribution-learning | Gaussian distribution, rho = 2 | Wasserstein Distance Error: 0.166 | 48 |
| Wasserstein distance error bound estimation | Emissions real-world | Fournier Analysis (N): 2.96 | 1 |
| Wasserstein distance error bound estimation | Miniboone | Fournier Distance Estimate (N): 11.77 | 1 |
| Wasserstein distance error bound estimation | OCTMNIST | Fournier Error Bound: 58.87 | 1 |
