Differentiation of Blackbox Combinatorial Solvers
About
Fusing deep learning with combinatorial algorithms promises transformative changes to artificial intelligence. One possible approach is to introduce combinatorial building blocks into neural networks. Such end-to-end architectures have the potential to tackle combinatorial problems on raw input data, for example ensuring global consistency in multi-object tracking or route planning on maps in robotics. In this work, we present a method that implements an efficient backward pass through blackbox implementations of combinatorial solvers with linear objective functions. We provide both theoretical and experimental backing. In particular, we incorporate the Gurobi MIP solver, the Blossom V algorithm, and Dijkstra's algorithm into architectures that extract suitable features from raw inputs for the traveling salesman problem, the min-cost perfect matching problem, and the shortest path problem. The code is available at https://github.com/martius-lab/blackbox-backprop.
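The core idea can be sketched in a few lines: the solver is treated as a blackbox map from costs `w` to a discrete solution `y(w)`, and the backward pass calls the solver once more on perturbed costs to form a finite-difference gradient. The snippet below is a minimal NumPy illustration, not the repository's implementation; the toy `argmin` "solver" stands in for a real combinatorial solver (e.g. Dijkstra), and the function names are ours.

```python
import numpy as np

def solver(weights):
    # Toy blackbox solver with a linear objective: returns the one-hot
    # indicator of the minimum-cost element. A real use case would call
    # e.g. Dijkstra or Blossom V here.
    y = np.zeros_like(weights)
    y[np.argmin(weights)] = 1.0
    return y

def blackbox_backward(weights, grad_output, lam=5.0):
    # Backward pass in the spirit of the paper's scheme: perturb the
    # costs in the direction of the incoming gradient dL/dy, re-solve,
    # and return a finite-difference gradient w.r.t. the costs.
    # lam controls the trade-off between informativeness and fidelity.
    y = solver(weights)
    y_lam = solver(weights + lam * grad_output)
    return -(y - y_lam) / lam
```

With `weights = [1, 2, 3]` the solver picks index 0; if the loss penalizes that choice (`grad_output = [1, 0, 0]`), the returned gradient pushes the cost of index 0 up and index 1 down, so a gradient step steers the solver toward the alternative solution.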
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Shortest Path Prediction | Warcraft II | Accuracy | 95.2 | 16 |
| Set Matching | Set Matching SM1 (test) | Regret (%) | 91.53 | 10 |
| Set Matching | Set Matching SM2 (test) | Regret (%) | 91.19 | 10 |
| Set Matching | Set Matching SM3 (test) | Regret (%) | 90.19 | 10 |
| Synthetic Shortest Path | Synthetic Shortest Path SP3 (test) | Regret Rate | 10.15 | 10 |
| Synthetic Shortest Path | Synthetic Shortest Path SP1 (test) | Regret (%) | 16.55 | 10 |
| Synthetic Shortest Path | Synthetic Shortest Path SP2 (test) | Regret (%) | 11.92 | 10 |