
From Domains to Instances: Dual-Granularity Data Synthesis for LLM Unlearning

About

Although machine unlearning is essential for removing private, harmful, or copyrighted content from LLMs, current benchmarks often fail to faithfully represent the true "forgetting scope" learned by the model. We formalize two distinct unlearning granularities, domain-level and instance-level, and propose BiForget, an automated framework for synthesizing high-quality forget sets. Unlike prior work that relies on external generators, BiForget exploits the target model itself, eliciting data that matches its internal knowledge distribution through seed-guided and adversarial prompting. Our experiments across diverse benchmarks show that it achieves a superior balance of relevance, diversity, and efficiency. Quantitatively, on the Harry Potter domain, it improves relevance by ${\sim}20$ and diversity by ${\sim}0.05$ while halving the total data size compared to state-of-the-art baselines. Ultimately, it facilitates more robust forgetting and better utility preservation, providing a more rigorous foundation for evaluating LLM unlearning.
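The dual-granularity idea can be sketched as follows. This is a minimal, illustrative toy, not the BiForget implementation: all function names are hypothetical, and the target model and its likelihood score are replaced by trivial stand-ins. The point is only the shape of the pipeline, where domain-level seeds are expanded into candidate instances by the target model itself, instance-level seeds are added directly, and candidates are kept only if the model assigns them sufficient confidence, so the forget set reflects what the model actually knows.

```python
def model_generate(seed, n=3):
    # Stand-in for the target model: expands a domain seed into
    # candidate instances (real use would prompt the LLM being unlearned).
    return [f"{seed} fact {i}" for i in range(n)]

def model_confidence(text):
    # Stand-in for a likelihood score under the target model; higher
    # means the text better matches the model's internal knowledge.
    return 1.0 / (1 + len(text) % 5)

def synthesize_forget_set(domain_seeds, instance_seeds, threshold=0.2):
    """Hypothetical dual-granularity synthesis loop.

    Domain-level seeds broaden coverage via model generation;
    instance-level seeds pin down specific facts. Candidates the
    model scores below `threshold` are dropped.
    """
    candidates = []
    for seed in domain_seeds:           # domain granularity: seed-guided
        candidates += model_generate(seed)
    candidates += list(instance_seeds)  # instance granularity: exact items
    return [c for c in candidates if model_confidence(c) >= threshold]

forget_set = synthesize_forget_set(
    domain_seeds=["Harry Potter"],
    instance_seeds=["Harry Potter attends Hogwarts"],
)
```

In the actual framework, the confidence filter would be replaced by likelihood under the target model, which is what ties the synthesized forget set to the model's internal knowledge distribution rather than to an external generator's.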

Xiaoyu Xu, Minxin Du, Zitong Li, Zi Liang, Zhibiao Guo, Shiyu Zhang, Peizhao Hu, Qingqing Ye, Haibo Hu • 2026

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Language Understanding | MMLU | Accuracy: 62.7 | 756 |
| Knowledge Unlearning | WMDP bio | Accuracy: 70.38 | 20 |
| Knowledge Unlearning | WMDP cyber | Accuracy: 46.9 | 16 |
| Unlearning | TOFU (forget01) | Forgetting Quality (F.Q.): 92 | 10 |
| Machine Unlearning | HP book | VerbMem: 0.00e+0 | 6 |
| Machine Unlearning | Textbook | VerbMem: 0.0106 | 6 |
| Machine Unlearning | BiForget | VerbMem: 0.00e+0 | 6 |
