
Contrastive Losses and Solution Caching for Predict-and-Optimize

About

Many decision-making processes involve solving a combinatorial optimization problem with uncertain input that can be estimated from historic data. Recently, problems in this class have been successfully addressed via end-to-end learning approaches, which rely on solving one optimization problem for each training instance at every epoch. In this context, we provide two distinct contributions. First, we use a Noise Contrastive approach to motivate a family of surrogate loss functions, based on viewing non-optimal solutions as negative examples. Second, we address a major bottleneck of all predict-and-optimize approaches, i.e. the need to frequently recompute optimal solutions at training time. This is done via a solver-agnostic solution caching scheme, and by replacing optimization calls with a lookup in the solution cache. The method is formally based on an inner approximation of the feasible space and, combined with a cache lookup strategy, provides a controllable trade-off between training time and accuracy of the loss approximation. We empirically show that even a very slow growth rate is enough to match the quality of state-of-the-art methods, at a fraction of the computational cost.
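A minimal sketch of the two ideas above, not the authors' exact formulation. The toy "solver" (pick the k cheapest items), the hinge-style contrastive loss, and the p_solve growth parameter are illustrative assumptions; the cache lookup and the use of cached non-optimal solutions as negative examples follow the description in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def exact_solver(cost, k=3):
    # Toy combinatorial problem (assumption): pick the k items with the
    # smallest cost. Any solver returning a feasible solution would do,
    # since the caching scheme is solver-agnostic.
    sol = np.zeros_like(cost)
    sol[np.argsort(cost)[:k]] = 1.0
    return sol

class SolutionCache:
    """Inner approximation of the feasible space: a pool of previously
    computed feasible solutions used in place of fresh solver calls."""
    def __init__(self, solutions):
        self.pool = list(solutions)

    def lookup(self, pred_cost, p_solve=0.05):
        # With small probability p_solve, call the real solver and grow the
        # cache (the growth rate trades training time against loss accuracy);
        # otherwise return the cached solution that is best under pred_cost.
        if rng.random() < p_solve:
            sol = exact_solver(pred_cost)
            self.pool.append(sol)
            return sol
        values = [pred_cost @ s for s in self.pool]
        return self.pool[int(np.argmin(values))]

def contrastive_loss(pred_cost, true_sol, cache):
    # Illustrative member of the surrogate-loss family: the true optimal
    # solution should score better (lower predicted cost) than the cached
    # non-optimal solutions, which act as negative examples.
    neg_values = np.array([pred_cost @ s for s in cache.pool])
    pos_value = pred_cost @ true_sol
    return np.maximum(0.0, pos_value - neg_values).mean()

# Tiny usage example with hypothetical data.
true_cost = rng.uniform(size=10)
true_sol = exact_solver(true_cost)
cache = SolutionCache([exact_solver(rng.uniform(size=10)) for _ in range(5)])
pred_cost = rng.uniform(size=10)  # stands in for a model's cost prediction
print(contrastive_loss(pred_cost, true_sol, cache))
print(cache.lookup(pred_cost))
```

At training time the expensive optimization call per instance and epoch is replaced by `cache.lookup`, so the solver runs only on the small fraction of calls governed by the growth rate.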

Maxime Mulamba, Jayanta Mandi, Michelangelo Diligenti, Michele Lombardi, Victor Bucarey, Tias Guns • 2020

Related benchmarks

Task                   | Dataset                                | Average Relative Regret | Rank
Shortest Path          | Shortest Path Degree 2, 4, 6, 8 (test) | 9.59                    | 32
Knapsack Problem       | Knapsack Degree 2, 4, 6, 8 (test)      | 20.07                   | 32
Portfolio Optimization | Portfolio Degree 2, 4, 6, 8 (test)     | 7.81                    | 32
Resource Allocation    | COVID Resource Allocation (test)       | 16.48                   | 8
Energy Scheduling      | Energy (test)                          | 1.59                    | 8
