Faster Accelerated First-order Methods for Convex Optimization with Strongly Convex Function Constraints
About
In this paper, we introduce faster accelerated primal-dual algorithms for minimizing a convex function subject to strongly convex function constraints. Prior to our work, the best-known complexity bound was $\mathcal{O}(1/\varepsilon)$, regardless of the strong convexity of the constraint function, and it was unclear whether the strong convexity assumption could enable better convergence results. To address this issue, we develop novel techniques to progressively estimate the strong convexity of the Lagrangian function. Our approach, for the first time, effectively leverages the strong convexity of the constraints, obtaining an improved complexity of $\mathcal{O}(1/\sqrt{\varepsilon})$. This rate matches the complexity lower bound for strongly-convex-concave saddle point optimization and is therefore order-optimal. We demonstrate the superior performance of our methods on sparsity-inducing constrained optimization, notably Google's personalized PageRank problem. Furthermore, we show that a restarted version of the proposed methods can identify the sparsity pattern of the optimal solution within a finite number of steps, a result of independent interest.
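For context, the personalized PageRank vector referenced above is the fixed point $p = \alpha s + (1-\alpha) P^\top p$, where $s$ is the seed (personalization) distribution. The sketch below computes it by classical power iteration; this is only a baseline illustration of the problem, not the paper's accelerated primal-dual method, and the graph, seed, and teleport parameter are all illustrative assumptions.

```python
def personalized_pagerank(adj, seed, alpha=0.15, tol=1e-10, max_iter=1000):
    """Classical power-iteration baseline for personalized PageRank.

    adj   : dict mapping node -> list of out-neighbors
    seed  : node carrying all personalization (teleport) mass
    alpha : teleport probability
    """
    nodes = list(adj)
    p = {v: 0.0 for v in nodes}
    s = {v: (1.0 if v == seed else 0.0) for v in nodes}
    for _ in range(max_iter):
        # Fixed-point map: p <- alpha * s + (1 - alpha) * P^T p
        nxt = {v: alpha * s[v] for v in nodes}
        for u in nodes:
            out = adj[u]
            if out:
                share = (1 - alpha) * p[u] / len(out)
                for v in out:
                    nxt[v] += share
            else:
                # Dangling node: return its mass to the seed
                nxt[seed] += (1 - alpha) * p[u]
        if sum(abs(nxt[v] - p[v]) for v in nodes) < tol:
            p = nxt
            break
        p = nxt
    return p

# Tiny 3-node example graph (hypothetical)
graph = {0: [1, 2], 1: [2], 2: [0]}
pr = personalized_pagerank(graph, seed=0)
```

The paper instead solves an $\ell_1$-regularized (sparsity-inducing) formulation of this problem, where the proposed methods exploit strong convexity in the constraint to obtain the $\mathcal{O}(1/\sqrt{\varepsilon})$ rate.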
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Personalized PageRank | bio-CE-HT | Execution Time | 77.21 | 6 |
| Personalized PageRank | bio-CE-LC | Execution Time | 0.44 | 6 |
| Personalized PageRank | econ-beaflw | Execution Time | 18.42 | 6 |
| Personalized PageRank | DD242 | Execution Time | 6.3 | 6 |
| Personalized PageRank | DD68 | Execution Time | 15.69 | 6 |
| Personalized PageRank | peking-1 | Execution Time | 4.86 | 6 |
| Computational Optimization | Small Optimization Problem Instances | Time (s) | 24.612 | 5 |