# Graph Neural Networks for Maximum Constraint Satisfaction

## About
Many combinatorial optimization problems can be phrased as constraint satisfaction problems. We introduce a graph neural network architecture for solving such optimization problems. The architecture is generic: it works for any binary constraint satisfaction problem. Training is unsupervised, and it suffices to train on relatively small instances; the resulting networks perform well on much larger instances (at least 10 times larger). We evaluate our approach experimentally on a variety of problems, including Maximum Cut and Maximum Independent Set. Despite its generality, our approach matches or surpasses most greedy and semidefinite-programming-based algorithms, and sometimes even outperforms state-of-the-art problem-specific heuristics.
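The paper's exact architecture is not reproduced here, but the core idea of an unsupervised training signal for a binary CSP can be illustrated on Maximum Cut: give each node a soft 0/1 assignment and directly ascend the expected number of satisfied (cut) edge constraints. This is a minimal sketch under those assumptions, not the paper's network; the function name `train_maxcut` and all hyperparameters are hypothetical.

```python
import math
import random

def train_maxcut(edges, n, steps=500, lr=0.1, seed=0):
    """Unsupervised soft-assignment training for Max-Cut (illustrative sketch).

    Each node v keeps a soft assignment p[v] in (0, 1). An edge (u, v) is
    cut with probability p[u]*(1-p[v]) + (1-p[u])*p[v]; we ascend the sum
    of these probabilities by plain gradient steps, then round to 0/1.
    """
    rng = random.Random(seed)
    logits = [rng.uniform(-0.1, 0.1) for _ in range(n)]
    sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))
    for _ in range(steps):
        p = [sigmoid(x) for x in logits]
        grad = [0.0] * n
        for u, v in edges:
            # d/dp[u] of p[u]*(1-p[v]) + (1-p[u])*p[v] = 1 - 2*p[v]
            grad[u] += 1.0 - 2.0 * p[v]
            grad[v] += 1.0 - 2.0 * p[u]
        for i in range(n):
            # chain rule through the sigmoid: dp/dlogit = p*(1-p)
            logits[i] += lr * grad[i] * p[i] * (1.0 - p[i])
    assignment = [1 if sigmoid(x) > 0.5 else 0 for x in logits]
    cut = sum(1 for u, v in edges if assignment[u] != assignment[v])
    return assignment, cut

# Tiny example: a 4-cycle, whose maximum cut is 4 (alternate the two sides).
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
assignment, cut = train_maxcut(edges, n=4)
```

The same recipe extends to other binary CSPs by swapping in the per-constraint satisfaction probability; no labeled solutions are needed at any point, which is what makes the training unsupervised.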
## Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Maximum Clique | COLLAB | Mean ApR | 0.9916 | 38 |
| Maximum Clique | Twitter MC instances (static) | Mean ApR | 0.917 | 38 |
| Maximum Clique | RB200 MC instances (static) | Mean ApR | 78.6 | 38 |
| Maximum Clique | RB500 MC instances (static) | Mean ApR | 79.3 | 36 |
| MaxCut | Gset (test) | ApR | 1 | 33 |
| MaxCut | GSET (\|V\|=800) | Average Gap to Best Known Cut | 185.9 | 9 |
| MaxCut | GSET (\|V\|=1K) | Average Gap to Best Known Cut | 156.6 | 9 |
| Maximum Clique | RB200 | Approximation Ratio | 0.8723 | 8 |
| Minimum Vertex Cover | IMDB | Approximation Ratio | 1 | 8 |
| Minimum Vertex Cover | | Approximation Ratio | 1.0158 | 8 |
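The ApR columns above report the approximation ratio: the objective value found by the method divided by the best known value for that instance, averaged over the benchmark ("Mean ApR"). A minimal illustration with hypothetical numbers (none of the values below come from the table):

```python
def approximation_ratio(found, best_known):
    """Ratio of the objective value found to the best known value.

    For a maximization problem (e.g. Max-Cut) this is at most 1, and a
    mean ApR close to 1 means the method nearly matches the best known
    solutions on average.
    """
    return found / best_known

# Hypothetical example: cut sizes found by a heuristic vs. best known cuts.
found = [96, 118, 87]
best = [100, 120, 90]
ratios = [approximation_ratio(f, b) for f, b in zip(found, best)]
mean_apr = sum(ratios) / len(ratios)
```

For a minimization problem such as Minimum Vertex Cover the ratio is instead at least 1, which is why an entry like 1.0158 can exceed 1.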