
Everything is Editable: Extend Knowledge Editing to Unstructured Data in Large Language Models

About

Recent knowledge editing methods have primarily focused on modifying structured knowledge in large language models. However, this task setting overlooks the fact that a significant portion of real-world knowledge is stored in an unstructured format, characterized by long-form content, noise, and a complex yet comprehensive nature. Techniques such as "local layer key-value storage" and "term-driven optimization", as used in previous methods like MEMIT, are not effective for handling unstructured knowledge. To address these challenges, we propose a novel unstructured knowledge editing method, UnKE, which extends previous assumptions along both the layer dimension and the token dimension. First, in the layer dimension, we propose non-local block key-value storage to replace local layer key-value storage, increasing the representation ability of key-value pairs and incorporating knowledge from attention layers. Second, in the token dimension, we replace "term-driven optimization" with "cause-driven optimization", which edits the last token directly while preserving context, avoiding the need to locate terms and preventing the loss of context information. Results on the newly proposed unstructured knowledge editing dataset (UnKEBench) and on traditional structured datasets demonstrate that UnKE achieves remarkable performance, surpassing strong baselines. In addition, UnKE has robust batch editing and sequential editing capabilities.
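The abstract is not accompanied by code here. As background for the "local layer key-value storage" assumption that MEMIT-style methods rely on (and which UnKE generalizes), a minimal sketch of a rank-one associative-memory edit is shown below. It treats a weight matrix W as a memory mapping a key vector k to a value vector v = W k, and inserts a new pair (k*, v*) so that W' k* = v* exactly. All names and dimensions are illustrative, not taken from the paper.

```python
# Minimal sketch (not the authors' code): the "local layer key-value
# storage" view treats an MLP weight matrix W as an associative memory
# retrieving value v = W @ k for key k. A MEMIT/ROME-style edit inserts
# a new pair (k*, v*) via the rank-one update
#     W' = W + ((v* - W k*) k*^T) / (k*^T k*),
# so W' k* = v* exactly, while directions orthogonal to k* are untouched.

def matvec(W, k):
    """Multiply matrix W (list of rows) by vector k."""
    return [sum(w_ij * k_j for w_ij, k_j in zip(row, k)) for row in W]

def rank_one_edit(W, k, v_target):
    """Return W' such that W' @ k == v_target (for nonzero k)."""
    v_old = matvec(W, k)
    kk = sum(x * x for x in k)                       # k^T k
    residual = [vt - vo for vt, vo in zip(v_target, v_old)]
    return [
        [w_ij + r_i * k_j / kk for w_ij, k_j in zip(row, k)]
        for row, r_i in zip(W, residual)
    ]

# Example: edit a 2x2 memory so key [1, 0] now retrieves value [5, 7].
W = [[1.0, 2.0], [3.0, 4.0]]
W_edited = rank_one_edit(W, [1.0, 0.0], [5.0, 7.0])
print(matvec(W_edited, [1.0, 0.0]))  # -> [5.0, 7.0]
```

Per the abstract, UnKE's argument is that such a single-layer linear store lacks the capacity for long-form unstructured edits, so it instead treats an entire Transformer block, including its attention layers, as the key-value store.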

Jingcheng Deng, Zihao Wei, Liang Pang, Hanxing Ding, Huawei Shen, Xueqi Cheng • 2024

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Knowledge Editing | UnKEBench Original questions | BERTScore | 98.34 | 18 |
| Knowledge Editing | UnKEBench Paraphrased questions (Para.) | BERTScore | 93.38 | 18 |
| Completion | AKEW Completion | Precision | 26.06 | 16 |
| Question Answering | AKEW | Precision | 37.76 | 16 |
| Knowledge Editing | UnKEBench | Precision | 19.71 | 16 |
| Unstructured Knowledge Editing | AKEW Com. | ROUGE-L Precision | 22.4 | 16 |
| Unstructured Knowledge Editing | AKEW-QA | ROUGE-L Precision | 33.67 | 16 |
| Unstructured Knowledge Editing | UnKEBench | Precision (ROUGE-L) | 15.95 | 16 |
| Sequential Knowledge Editing | zsRE | Efficacy | 0.4658 | 12 |
| Knowledge Editing | MQuAKE | Average Accuracy | 0.3401 | 8 |
(Showing 10 of 13 rows.)
