
Modifying Memories in Transformer Models

About

Large Transformer models have achieved impressive performance on many natural language tasks. In particular, Transformer-based language models have been shown to have great capabilities in encoding factual knowledge in their vast number of parameters. While the tasks of improving the memorization and generalization of Transformers have been widely studied, it is not well known how to make Transformers forget specific old facts and memorize new ones. In this paper, we propose a new task of explicitly modifying specific factual knowledge in Transformer models while ensuring that model performance does not degrade on the unmodified facts. This task is useful in many scenarios, such as updating stale knowledge, protecting privacy, and eliminating unintended biases stored in the models. We benchmarked several approaches that provide natural baseline performances on this task. This leads to the discovery of key components of a Transformer model that are especially effective for knowledge modifications. The work also provides insights into the role that different training phases (such as pretraining and fine-tuning) play in memorization and knowledge modification.
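As a concrete illustration of the task setup (not the paper's specific method), the sketch below shows one natural style of baseline: fine-tune the pretrained model only on a small set of modified facts while penalizing drift from the original weights, so that behavior on the unmodified facts is less likely to degrade. The model name, the example fact, and the L2 drift penalty are illustrative assumptions.

```python
# Minimal sketch (assumptions throughout, not the paper's exact recipe):
# fine-tune a pretrained masked LM on a handful of "modified facts" while
# penalizing drift from the original weights, so behavior on unmodified
# facts is less likely to degrade.
import copy
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_name = "bert-base-uncased"                 # assumption: any masked LM works here
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

frozen = copy.deepcopy(model).eval()             # snapshot of the original weights
for p in frozen.parameters():
    p.requires_grad_(False)

# Hypothetical modified facts as cloze statements with a single-token answer.
modified_facts = [
    ("The capital of France is [MASK].", "Rome"),  # deliberately altered fact
]

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
drift_weight = 1.0                                # assumption: penalty strength

model.train()
for step in range(10):
    for text, target in modified_facts:
        enc = tokenizer(text, return_tensors="pt")
        target_id = tokenizer(target, add_special_tokens=False).input_ids[0]

        labels = torch.full_like(enc.input_ids, -100)                 # ignore all positions...
        labels[enc.input_ids == tokenizer.mask_token_id] = target_id  # ...except the [MASK] slot

        out = model(**enc, labels=labels)

        # L2 penalty on parameter drift from the original model, standing in for
        # the kind of constraint that protects the unmodified facts.
        drift = sum(((p - q) ** 2).sum()
                    for p, q in zip(model.parameters(), frozen.parameters()))
        loss = out.loss + drift_weight * drift

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

In the paper's framing, success on the task is then judged both by whether the model recalls the newly inserted facts and by how little its accuracy drops on the facts that were left unchanged.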

Chen Zhu, Ankit Singh Rawat, Manzil Zaheer, Srinadh Bhojanapalli, Daliang Li, Felix Yu, Sanjiv Kumar • 2020

Related benchmarks

Task | Dataset | Result | Rank
Knowledge Editing | zsRE | Generality: 13 | 110
Knowledge Editing | CounterFact | Efficacy: 1.62e+3 | 91
Knowledge Insertion | WikiData recent | Edit Success Rate: 23.94 | 43
Sequential Model Editing | CounterFact | Efficacy: 92.15 | 24
Sequential Model Editing | zsRE | Efficacy: 72.37 | 24
Multimodal Knowledge Editing | MMQAKE Rephrased Image | M-Acc: 1.61 | 18
Multimodal Knowledge Editing | MMQAKE Original Image | M-Acc: 1.66 | 18
Knowledge Editing | UnKEBench Original questions | BERTScore: 44.02 | 18
Knowledge Editing | UnKEBench Paraphrased questions (Para.) | BERTScore: 40.33 | 18
Knowledge Editing | UnKEBench | Precision: 50.66 | 16
Showing 10 of 28 rows
