
Towards Gradient-based Bilevel Optimization with Non-convex Followers and Beyond

About

In recent years, Bi-Level Optimization (BLO) techniques have received extensive attention from both the learning and vision communities. A variety of BLO models arising in complex and practical tasks have a non-convex follower structure by nature (i.e., they lack Lower-Level Convexity, LLC for short). However, this challenging class of BLOs still lacks both efficient solution strategies and solid theoretical guarantees. In this work, we propose a new algorithmic framework, named Initialization Auxiliary and Pessimistic Trajectory Truncated Gradient Method (IAPTT-GM), to partially address these issues. In particular, by introducing an auxiliary variable as the initialization to guide the optimization dynamics and designing a pessimistic trajectory-truncation operation, we construct a reliable approximate version of the original BLO in the absence of the LLC hypothesis. Our theoretical investigations establish the convergence of solutions returned by IAPTT-GM to those of the original BLO without LLC. As an additional bonus, we also theoretically justify the quality of IAPTT-GM embedded with Nesterov's accelerated dynamics under LLC. The experimental results confirm both the convergence of our algorithm without LLC and our theoretical findings under LLC.
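To make the two mechanisms in the abstract concrete, here is a minimal toy sketch (not the authors' implementation): an auxiliary variable z serves as the lower-level initialization, the inner trajectory is truncated at the iterate that maximizes the upper-level loss (the pessimistic truncation), and hypergradients are approximated by finite differences. The scalar objectives F and f below are our own illustrative choices, not taken from the paper.

```python
import numpy as np

# Toy scalar instance (our own choice, not from the paper). IAPTT-GM
# targets non-convex lower-level problems; a quadratic keeps this
# sketch short and verifiable.
def F(x, y):
    return (y - 1.0) ** 2 + 0.1 * x ** 2   # upper-level objective

def f_grad_y(x, y):
    return 2.0 * (y - x)                    # gradient of f(x, y) = (y - x)^2

def inner_trajectory(x, z, K=20, lr=0.3):
    """K lower-level gradient steps, started from the auxiliary
    initialization z (the 'initialization auxiliary')."""
    ys = [z]
    for _ in range(K):
        ys.append(ys[-1] - lr * f_grad_y(x, ys[-1]))
    return ys

def pessimistic_value(x, z):
    """Truncate the inner trajectory at the iterate maximizing the
    upper-level loss (pessimistic trajectory truncation) and return
    F evaluated there."""
    ys = inner_trajectory(x, z)
    k_star = int(np.argmax([F(x, y) for y in ys]))
    return F(x, ys[k_star])

def hypergrad(x, z, eps=1e-5):
    """Central finite-difference hypergradients of the truncated
    objective w.r.t. the leader variable x and the auxiliary init z."""
    gx = (pessimistic_value(x + eps, z) - pessimistic_value(x - eps, z)) / (2 * eps)
    gz = (pessimistic_value(x, z + eps) - pessimistic_value(x, z - eps)) / (2 * eps)
    return gx, gz

# Outer loop: jointly descend on the leader variable x and the
# auxiliary initialization z.
x, z = 3.0, -2.0
for _ in range(200):
    gx, gz = hypergrad(x, z)
    x -= 0.1 * gx
    z -= 0.1 * gz
```

In this sketch, minimizing the pessimistically truncated value drives both the leader variable and the auxiliary initialization toward points where even the worst inner iterate yields a low upper-level loss; the paper's actual method differentiates through the truncated trajectory rather than using finite differences.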

Risheng Liu, Yaohua Liu, Shangzhi Zeng, Jin Zhang• 2021

Related benchmarks

Task: Hyper-data Cleaning
Dataset: MNIST (test)
Metric: Test Accuracy
Result: 0.8947
Rank: 31
