
Achieving optimal complexity guarantees for a class of bilevel convex optimization problems

About

We design and analyze a novel accelerated gradient-based algorithm for a class of bilevel optimization problems. These problems arise in applications such as machine learning and image processing, where the optimal solutions of the two levels are interdependent: the upper-level problem is minimized over the solution set of a lower-level optimization problem. We significantly improve the existing iteration complexity to $\mathcal{O}(\epsilon^{-0.5})$ for both the suboptimality and infeasibility error metrics, where $\epsilon>0$ denotes an arbitrary accuracy level. In addition, in contrast to existing methods that proceed sequentially (first running one algorithm to approximate the lower-level solution set, then a second algorithm on the upper level), our algorithm addresses both levels concurrently. To the best of our knowledge, the proposed algorithm has the fastest known iteration complexity, matching the optimal complexity for single-level optimization. We conduct numerical experiments on sparse linear regression problems to demonstrate the efficacy of our approach.
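To illustrate the bilevel structure the abstract describes, here is a minimal sketch (not the authors' algorithm) of a concurrent, single-loop approach on a toy instance. We take the lower level to be least-squares, $g(x)=\tfrac{1}{2}\|Ax-b\|^2$, and a hypothetical smooth upper-level objective $f(x)=\tfrac{1}{2}\|x\|^2$ that selects the least-norm lower-level solution; a vanishing weight $\lambda_k$ damps the upper-level gradient so both levels are driven in one loop. All names and parameter choices below are illustrative assumptions.

```python
import numpy as np

# Hypothetical toy instance: underdetermined least squares, so the
# lower-level problem g(x) = 0.5||Ax - b||^2 has many solutions.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
b = A @ rng.standard_normal(50)

def bilevel_single_loop(A, b, steps=20000):
    """Iteratively regularized gradient sketch: one loop that combines the
    lower-level gradient with a vanishing multiple of the upper-level
    gradient, rather than solving the two levels sequentially."""
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2   # step size from Lipschitz constant of grad g
    x = np.zeros(A.shape[1])
    for k in range(1, steps + 1):
        lam = 1.0 / np.sqrt(k)                # vanishing upper-level weight
        grad = A.T @ (A @ x - b) + lam * x    # grad g(x) + lam * grad f(x)
        x -= gamma * grad
    return x

x = bilevel_single_loop(A, b)
x_min_norm = np.linalg.pinv(A) @ b            # reference least-norm solution
print(np.linalg.norm(A @ x - b))              # infeasibility (lower-level residual)
print(np.linalg.norm(x - x_min_norm))         # suboptimality proxy for the upper level
```

The two printed quantities mirror the paper's two error metrics: the lower-level residual measures infeasibility, and the distance to the least-norm solution tracks upper-level suboptimality.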

Sepideh Samadi, Daniel Burbano, Farzad Yousefian • 2023

Related benchmarks

Task                | Dataset                    | Result                    | Rank
Logistic Regression | 1,000 songs sample (train) | Lower-level Value: 0.3279 | 11
