Erdős Goes Neural: an Unsupervised Learning Framework for Combinatorial Optimization on Graphs
About
Combinatorial optimization (CO) problems are notoriously challenging for neural networks, especially in the absence of labeled instances. This work proposes an unsupervised learning framework for CO problems on graphs that can provide integral solutions of certified quality. Inspired by Erdős' probabilistic method, we use a neural network to parametrize a probability distribution over sets. Crucially, we show that when the network is optimized w.r.t. a suitably chosen loss, the learned distribution contains, with controlled probability, a low-cost integral solution that obeys the constraints of the combinatorial problem. The probabilistic proof of existence is then derandomized to decode the desired solutions. We demonstrate the efficacy of this approach by obtaining valid solutions to the maximum clique problem and by performing local graph clustering. Our method achieves competitive results on both real datasets and synthetic hard instances.
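The derandomization step above can be illustrated with a minimal NumPy sketch. The exact loss used in the paper is not reproduced here; `clique_loss` below is an assumed expected-cost surrogate in the same spirit (reward expected edges inside the sampled set, penalize expected non-edge pairs), and `derandomize` applies the method of conditional expectation: each node's probability is rounded to 0 or 1, keeping whichever choice does not increase the expected loss, so the final integral solution costs no more than the expectation.

```python
import numpy as np

def clique_loss(p, adj, beta=2.0):
    """Assumed expected-cost surrogate for maximum clique:
    penalize expected non-edge pairs inside the set (weight beta)
    and reward expected edges. p[i] is the probability node i is picked."""
    P = np.outer(p, p)
    iu = np.triu_indices_from(adj, k=1)      # each unordered pair once
    edges = adj[iu].astype(bool)
    return beta * P[iu][~edges].sum() - P[iu][edges].sum()

def derandomize(p, adj, beta=2.0):
    """Method of conditional expectation: sequentially fix each node to
    0 or 1, keeping the choice with the lower conditional expected loss."""
    p = p.astype(float).copy()
    for i in np.argsort(-p):                 # visit nodes by decreasing probability
        best_v, best_loss = None, np.inf
        for v in (1.0, 0.0):
            p[i] = v
            loss = clique_loss(p, adj, beta)
            if loss < best_loss:
                best_v, best_loss = v, loss
        p[i] = best_v
    return p.astype(int)                      # indicator vector of the decoded set

# Toy usage: a triangle {0,1,2} with a pendant node 3 attached to node 2.
adj = np.zeros((4, 4))
for i, j in [(0, 1), (0, 2), (1, 2), (2, 3)]:
    adj[i, j] = adj[j, i] = 1.0
p = np.array([0.9, 0.8, 0.7, 0.4])           # stand-in for network output
print(derandomize(p, adj))                   # recovers the triangle: [1 1 1 0]
```

The node ordering (by decreasing probability) is a heuristic choice; any fixed order preserves the guarantee that the decoded set's loss does not exceed the expected loss of the distribution.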
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Maximum Clique | RB200 MC instances (static) | Mean ApR | 99.455 | 38 |
| Maximum Clique | Twitter MC instances (static) | Mean ApR | 0.9915 | 38 |
| Maximum Clique | RB500 MC instances (static) | Mean ApR | 98.121 | 36 |
| Maximum Clique | COLLAB | Mean ApR | 0.9997 | 30 |
| Minimum Vertex Cover | RB200 (test) | Approximation Ratio | 1.0098 | 24 |
| Minimum Vertex Cover | COLLAB (test) | AR* | 1.0002 | 16 |
| Maximum Independent Set | SPECIAL (test) | Approximation Ratio | 0.921 | 13 |
| Maximum Independent Set | Twitter (test) | Approximation Ratio | 0.935 | 13 |
| Minimum Vertex Cover | RB500 (test) | Approximation Ratio | 1.021 | 13 |
| Minimum Vertex Cover | Twitter (test) | AR* | 1.0041 | 12 |