
Discovering Invariant Rationales for Graph Neural Networks

About

Intrinsic interpretability of graph neural networks (GNNs) refers to finding a small subset of the input graph's features -- the rationale -- that guides the model's prediction. Unfortunately, leading rationalization models often rely on data biases, especially shortcut features, to compose rationales and make predictions without probing the critical, causal patterns. Moreover, such data biases easily change outside the training distribution, so these models suffer a large drop in interpretability and predictive performance on out-of-distribution data. In this work, we propose a new strategy, Discovering Invariant Rationales (DIR), to construct intrinsically interpretable GNNs. DIR conducts interventions on the training distribution to create multiple interventional distributions, then seeks the causal rationales that remain invariant across these distributions while filtering out the spurious patterns that are unstable. Experiments on both synthetic and real-world datasets validate the superiority of DIR in interpretability and generalization on graph classification over leading baselines. Code and datasets are available at https://github.com/Wuyxin/DIR-GNN.
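To make the invariance idea concrete, here is a minimal toy sketch (not the paper's implementation) of the DIR-style selection principle: pair each candidate rationale with different non-causal complements (the "interventions"), then score it by its mean loss plus a penalty on how much the loss varies across those interventions. The function name `dir_style_risk`, the variance penalty form, and the example loss values are all illustrative assumptions.

```python
import statistics

def dir_style_risk(interventional_losses, variance_weight=1.0):
    """Toy DIR-style objective (hypothetical simplification).

    interventional_losses: for one candidate rationale, the loss
    observed under each interventional distribution, i.e. after
    pairing that rationale with different spurious complements.
    A causal rationale should give a low AND stable loss across
    interventions, so we score mean loss plus a variance penalty.
    """
    mean_loss = statistics.mean(interventional_losses)
    var_loss = statistics.pvariance(interventional_losses)
    return mean_loss + variance_weight * var_loss

# Illustrative numbers: a causal rationale predicts about equally
# well no matter which complement it is paired with; a shortcut
# rationale is accurate only under the biased (training) pairing.
causal_losses = [0.30, 0.32, 0.31]
shortcut_losses = [0.05, 0.90, 0.70]

scores = {
    "causal": dir_style_risk(causal_losses),
    "shortcut": dir_style_risk(shortcut_losses),
}
best = min(scores, key=scores.get)  # the invariance criterion picks "causal"
```

Even though the shortcut attains the lowest single-distribution loss (0.05), its instability across interventions makes its overall score worse, which is the intuition behind filtering out spurious patterns.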

Ying-Xin Wu, Xiang Wang, An Zhang, Xiangnan He, Tat-Seng Chua • 2022

Related benchmarks

Task | Dataset | Metric | Result | Rank
Graph Classification | Mutag (test) | Accuracy | 89.5 | 217
Graph Classification | MolHIV | ROC AUC | 58.08 | 82
Graph Classification | CMNIST-75sp unbiased (test) | Accuracy | 10.38 | 60
Graph Classification | CFashion-75sp unbiased (test) | Accuracy | 16.77 | 60
Graph Classification | CKuzushiji-75sp unbiased (test) | Accuracy | 10.72 | 60
Graph Classification | Twitter | Accuracy | 59.85 | 57
Graph Classification | DrugOOD EC50 (OOD test) | ROC AUC | 65.81 | 52
Graph Classification | DD (test) | Accuracy | 74.1 | 44
Molecular property prediction | BACE | ROC AUC | 76.77 | 35
Molecular property prediction | BBBP | ROC AUC | 0.6813 | 35

(Showing 10 of 131 rows)
