
Erase then Rectify: A Training-Free Parameter Editing Approach for Cost-Effective Graph Unlearning

About

Graph unlearning, which aims to eliminate the influence of specific nodes, edges, or attributes from a trained Graph Neural Network (GNN), is essential in applications where privacy, bias, or data obsolescence is a concern. However, existing graph unlearning techniques often necessitate additional training on the remaining data, leading to significant computational costs, particularly with large-scale graphs. To address these challenges, we propose a two-stage training-free approach, Erase then Rectify (ETR), designed for efficient and scalable graph unlearning while preserving model utility. Specifically, we first build a theoretical foundation showing that masking parameters critical for unlearned samples enables effective unlearning. Building on this insight, the Erase stage strategically edits model parameters to eliminate the impact of unlearned samples and their propagated influence on intercorrelated nodes. To further preserve the GNN's utility, the Rectify stage devises a gradient approximation method to estimate the model's gradient on the remaining dataset, which is then used to enhance model performance. Overall, ETR achieves graph unlearning without additional training or full access to the training data, significantly reducing computational overhead and preserving data privacy. Extensive experiments on seven public datasets demonstrate the consistent superiority of ETR in model utility, unlearning efficiency, and unlearning effectiveness, establishing it as a promising solution for real-world graph unlearning challenges.
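The two stages described above can be sketched on a toy linear model. This is a minimal illustration of the general erase-then-rectify pattern, not the paper's method: the saliency score (gradient times weight), the mask ratio, and the single correction step are all illustrative assumptions, and the paper's actual procedure operates on GNN parameters with a gradient approximation rather than the retained set's exact gradient.

```python
import numpy as np

# Illustrative sketch of an erase-then-rectify workflow on a linear model.
# Stage 1 (Erase): zero out the parameters most salient for the forget set.
# Stage 2 (Rectify): take a gradient correction step on retained data.
# Saliency measure, mask ratio, and learning rate are assumptions for this demo.

rng = np.random.default_rng(0)

def grad_mse(w, X, y):
    """Gradient of mean squared error for the linear model y_hat = X @ w."""
    return 2.0 * X.T @ (X @ w - y) / len(y)

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

def erase(w, X_forget, y_forget, mask_ratio=0.2):
    """Erase stage: mask the parameters most critical for the forget set."""
    saliency = np.abs(grad_mse(w, X_forget, y_forget) * w)  # assumed score
    k = max(1, int(mask_ratio * w.size))
    mask_idx = np.argsort(-saliency)[:k]
    w_new = w.copy()
    w_new[mask_idx] = 0.0
    return w_new

def rectify(w, X_retain, y_retain, lr=0.1):
    """Rectify stage: one corrective gradient step using retained data."""
    return w - lr * grad_mse(w, X_retain, y_retain)

# Toy data: 100 samples, 10 features; the first 10 samples are "forgotten".
X = rng.normal(size=(100, 10))
w_true = rng.normal(size=10)
y = X @ w_true + 0.1 * rng.normal(size=100)
w = np.linalg.lstsq(X, y, rcond=None)[0]  # the "trained" model

X_f, y_f = X[:10], y[:10]    # forget set
X_r, y_r = X[10:], y[10:]    # retained set

w_erased = erase(w, X_f, y_f)          # unlearning via parameter masking
w_final = rectify(w_erased, X_r, y_r)  # utility restoration
```

Note that neither stage retrains the model: Erase is a one-shot parameter edit, and Rectify is a single closed-form correction, which is what makes the overall approach training-free.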

Zhe-Rui Yang, Jindong Han, Chang-Dong Wang, Hao Liu • 2024

Related benchmarks

Task             Dataset           Metric                           Result   Rank
Edge unlearning  Photo (hard)      Trade-off of Unlearning (ToU)    60.82    26
Edge unlearning  Chameleon (hard)  Trade-off of Unlearning (ToU)    59.47    25
Node unlearning  Cora              Average Runtime (s)              0.02     20
Node unlearning  Pubmed            Runtime (s)                      0.02     20
Node unlearning  CS                Average Unlearning Runtime (s)   0.02     20
Node unlearning  Physics           Runtime (s)                      0.03     20
Node unlearning  arXiv             Average Runtime (s)              0.03     20
Node unlearning  Chameleon         Average Runtime (s)              0.02     20
Node unlearning  Squirrel          Average Runtime (s)              0.01     20
Node unlearning  Citeseer          Average Runtime (s)              0.02     20

(Showing 10 of 14 rows.)
