
Convex Bi-Level Optimization Problems with Non-smooth Outer Objective Function

About

In this paper, we propose the Bi-Sub-Gradient (Bi-SG) method, a generalization of the classical sub-gradient method to the setting of convex bi-level optimization problems. Bi-SG is a first-order method that is very easy to implement, in the sense that it requires only computing the proximal mapping or a sub-gradient of the non-smooth outer objective function, in addition to a proximal gradient step on the inner optimization problem. We show, under very mild assumptions, that Bi-SG solves bi-level optimization problems with sub-linear rates in terms of both the inner and outer objective functions. Moreover, if the outer objective function is additionally strongly convex (while possibly still non-smooth), the outer rate improves to a linear rate. Finally, we prove that the distance of the generated sequence to the set of optimal solutions of the bi-level problem converges to zero.
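The abstract describes a single-loop scheme that interleaves a proximal gradient step on the inner problem with a sub-gradient step on the outer objective. The sketch below is not the paper's exact Bi-SG method (its precise step-size rules and combination of the two steps are specified in the paper); it is a minimal illustration of that structure, assuming a smooth least-squares inner problem, an l1-norm outer objective, and a hypothetical diminishing outer step size:

```python
import numpy as np

# Illustrative setup (not from the paper): inner problem min 0.5*||Ax - b||^2,
# outer objective ||x||_1 (non-smooth, with sub-gradient sign(x)).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = A @ rng.standard_normal(10)       # consistent system, so the inner optimal value is 0

L = np.linalg.norm(A, 2) ** 2         # Lipschitz constant of the inner gradient
x = np.zeros(10)
for k in range(1, 2001):
    grad_inner = A.T @ (A @ x - b)    # gradient of the smooth inner objective
    y = x - grad_inner / L            # (proximal) gradient step on the inner problem
    eta = 1.0 / (L * k ** 0.75)       # hypothetical diminishing outer step size
    x = y - eta * np.sign(y)          # sub-gradient step on the outer ||.||_1

inner_val = 0.5 * np.linalg.norm(A @ x - b) ** 2   # inner objective at the final iterate
```

Because the outer step size shrinks while the inner gradient step keeps correcting toward inner optimality, the iterates approach the inner solution set while the outer perturbation biases them toward small l1 norm, mirroring the sub-linear inner/outer behavior stated in the abstract.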

Roey Merchav, Shoham Sabach · 2023

Related benchmarks

Task                      Dataset                      Metric             Result   Rank
Logistic Regression       1,000 songs sample (train)   Lower-level Value  0.3377   11
Least Squares Regression  1,000 songs sample (train)   Lower-level Value  0.0086   8
