
Efficient Bilevel Optimization with KFAC-Based Hypergradients

About

Bilevel optimization (BO) is widely applicable to many machine learning problems. Scaling BO, however, requires repeatedly computing hypergradients, which involves solving inverse Hessian-vector products (IHVPs). In practice, these operations are often approximated using crude surrogates such as one-step gradient unrolling or identity/short Neumann expansions, which discard curvature information. We build on implicit function theorem-based algorithms and propose to incorporate Kronecker-factored approximate curvature (KFAC), yielding curvature-aware hypergradients with a better performance-efficiency trade-off than Conjugate Gradient (CG) or Neumann methods, and consistently outperforming unrolling. We evaluate this approach across diverse tasks, including meta-learning and AI safety problems. On models up to BERT, we show that curvature information is valuable at scale, and that KFAC can provide it with only modest memory and runtime overhead. Our implementation is available at https://github.com/liaodisen/NeuralBo.
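The core operation the abstract describes is replacing an exact IHVP with a KFAC-approximated one: a layer's Hessian (with respect to its weight matrix) is approximated as a Kronecker product A ⊗ G of two small factors, whose inverse applies cheaply via two triangular-size solves instead of one large one. The sketch below illustrates that inversion step in NumPy; it is not the paper's implementation, and the function name `kfac_ihvp` and the per-factor damping scheme are illustrative choices.

```python
import numpy as np

def kfac_ihvp(A, G, v, damping=1e-3):
    """Approximate inverse Hessian-vector product under KFAC.

    The curvature of an out_dim x in_dim weight matrix W is modeled as
    the Kronecker product G (x) A (row-major vec convention), so
        (G (x) A)^{-1} vec(V) = vec(G^{-1} V A^{-1}).

    A: (in_dim, in_dim)  input-covariance factor (SPD)
    G: (out_dim, out_dim) gradient-covariance factor (SPD)
    v: flat vector of length out_dim * in_dim (row-major vec of V)
    damping: Tikhonov damping added to each factor (illustrative choice)
    """
    in_dim, out_dim = A.shape[0], G.shape[0]
    V = v.reshape(out_dim, in_dim)
    # Damp each Kronecker factor separately for numerical stability.
    A_d = A + damping * np.eye(in_dim)
    G_d = G + damping * np.eye(out_dim)
    X = np.linalg.solve(G_d, V)           # G^{-1} V
    X = np.linalg.solve(A_d.T, X.T).T     # (G^{-1} V) A^{-1}
    return X.ravel()

# Usage: apply the KFAC IHVP and compare against the dense inverse.
rng = np.random.default_rng(0)
in_dim, out_dim = 3, 2
A0 = rng.standard_normal((in_dim, in_dim))
A = A0 @ A0.T + in_dim * np.eye(in_dim)   # make SPD
G0 = rng.standard_normal((out_dim, out_dim))
G = G0 @ G0.T + out_dim * np.eye(out_dim)
v = rng.standard_normal(out_dim * in_dim)

d = 1e-3
H = np.kron(G + d * np.eye(out_dim), A + d * np.eye(in_dim))
dense = np.linalg.solve(H, v)             # O((in*out)^3) reference
fast = kfac_ihvp(A, G, v, damping=d)      # two small solves
print(np.allclose(fast, dense, atol=1e-8))
```

The factored solve costs O(in_dim³ + out_dim³) rather than O((in_dim·out_dim)³) for the dense inverse, which is the efficiency gap the abstract refers to when comparing KFAC against CG or Neumann iterations on the full Hessian.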

Disen Liao, Felix Dangel, Yaoliang Yu • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Data Poisoning Defense | CIFAR-10 (test) | Test Accuracy | 64.3 | 76 |
| Image Classification | CIFAR-10 Long-Tailed | Accuracy | 88.4 | 59 |
| Image Classification | CIFAR-100 Long-Tailed | Accuracy | 60.8 | 59 |
| Data Poisoning | MNIST (test) | Clean Accuracy | 98.02 | 8 |
| Sample Unlearning | CIFAR-10 (test) | Accuracy (ResNet18) | 42.33 | 6 |
| Continued Pretraining | ChemProt (test) | Micro F1 | 85.04 | 4 |
| Continued Pretraining | ACL-ARC (test) | Macro F1 | 72.98 | 4 |
| Continued Pretraining | SCIERC (test) | Macro F1 | 81.22 | 4 |
| Continued Pretraining | HyperPartisan (test) | Macro F1 | 95.13 | 4 |
