# Set2Graph: Learning Graphs From Sets

## About
Many problems in machine learning can be cast as learning functions from sets to graphs, or more generally to hypergraphs; in short, Set2Graph functions. Examples include clustering, learning vertex and edge features on graphs, and learning features on triplets in a collection. A natural approach to building Set2Graph models is to characterize all linear equivariant set-to-hypergraph layers and stack them with non-linear activations. This poses two challenges: (i) the expressive power of these networks is not well understood; and (ii) these models would suffer from high, often intractable, computational and memory complexity, as their dimension grows exponentially. This paper advocates a family of neural network models for learning Set2Graph functions that is both practical and of maximal expressive power (universal), that is, it can approximate arbitrary continuous Set2Graph functions over compact sets. Testing these models on different machine learning tasks, mainly an application to particle physics, we find that they compare favorably to existing baselines.
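As a concrete illustration of the set-to-graph idea, the sketch below (an assumption for exposition, not the paper's exact architecture) composes a permutation-equivariant set-to-set embedding `phi` with a broadcasting step that scores every ordered pair of elements, producing an n×n edge-score matrix from an input set of n vectors. All weights and dimensions here are hypothetical placeholders.

```python
import numpy as np

# Hypothetical fixed weights for the sketch (random, untrained).
rng = np.random.default_rng(0)
D_IN, D_HID = 3, 8
W1 = rng.standard_normal((D_IN, D_HID))
W2 = rng.standard_normal((D_IN, D_HID))
w_edge = rng.standard_normal(2 * D_HID)

def phi(x):
    """Set-to-set component: a shared per-element map plus a sum-pooled
    context term, so permuting the input rows permutes the output rows
    the same way (permutation equivariance)."""
    return np.tanh(x @ W1 + x.sum(axis=0, keepdims=True) @ W2)

def edge_scores(x):
    """Broadcast the embeddings to all ordered pairs (i, j) and score
    each pair with a shared linear map, yielding an n x n matrix."""
    h = phi(x)                                  # (n, D_HID)
    n = h.shape[0]
    left = np.repeat(h, n, axis=0)              # h[i] repeated per j
    right = np.tile(h, (n, 1))                  # h[j] cycled per i
    pairs = np.concatenate([left, right], axis=1)   # (n*n, 2*D_HID)
    return (pairs @ w_edge).reshape(n, n)

x = rng.standard_normal((5, D_IN))   # a set of 5 elements in R^3
S = edge_scores(x)
print(S.shape)                       # (5, 5): one score per ordered pair
```

Because `phi` is equivariant and the pair scoring is shared across pairs, relabeling the input set simply permutes the rows and columns of the output matrix, which is exactly the symmetry a set-to-graph function must respect.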
## Related benchmarks
| Task | Dataset | Metric | Value | Rank |
|---|---|---|---|---|
| Set-to-graph prediction | Jets (B) | RI | 65.5 | 9 |
| Set-to-graph prediction | Jets (C) | RI | 75.1 | 9 |
| Set-to-graph prediction | Jets (L) | RI | 97.3 | 9 |
| Set-to-graph prediction | Delaunay 50 points | Accuracy | 98.4 | 9 |
| Set-to-graph prediction | Delaunay 20-80 points | Accuracy | 94.7 | 9 |
| Hyperedge prediction | GPS (test) | AUC | 94.3 | 7 |
| Hyperedge prediction | MovieLens (test) | AUC | 0.918 | 7 |
| Hyperedge prediction | Drug (test) | AUC | 0.963 | 7 |