
CORRECT: Context- and Reference-Augmented Reasoning and Prompting for Fact-Checking

About

Fact-checking the truthfulness of claims usually requires reasoning over multiple evidence sentences. However, evidence sentences are not always self-contained: they may require additional context and references from elsewhere to resolve coreferential expressions, acronyms, and the scope of a reported finding. For example, evidence sentences drawn from an academic paper may need contextual sentences in that paper, as well as descriptions in its cited papers, to determine the scope of a research discovery. Yet most fact-checking models focus mainly on reasoning within the evidence sentences themselves and ignore these auxiliary contexts and references. To address this problem, we propose CORRECT, a novel Context- and Reference-augmented Reasoning and Prompting method. For evidence reasoning, we construct a three-layer evidence graph with evidence, context, and reference layers, and design intra- and cross-layer reasoning to integrate the three layers into a unified evidence embedding. For verdict prediction, we design an evidence-conditioned prompt encoder that produces a unique prompt embedding for each claim; the prompt embedding and claim are then unified for fact-checking. Experiments verify the strength of our model.
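The abstract's pipeline (intra-layer reasoning within each graph layer, cross-layer reasoning into a unified evidence embedding) can be sketched with toy attention operations. This is a minimal illustration only: the layer sizes, random embeddings, and the specific attention/pooling choices below are our assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8  # toy embedding dimension (assumption, not from the paper)

# Hypothetical sentence embeddings for the three graph layers.
evidence  = rng.normal(size=(3, dim))   # 3 evidence sentences
context   = rng.normal(size=(2, dim))   # 2 contextual sentences
reference = rng.normal(size=(4, dim))   # 4 sentences from cited papers

def softmax(scores):
    # Row-wise softmax for attention weights.
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def intra_layer(nodes):
    """Intra-layer reasoning sketch: each node attends within its own layer."""
    weights = softmax(nodes @ nodes.T / np.sqrt(dim))
    return weights @ nodes

def cross_layer(queries, other):
    """Cross-layer reasoning sketch: evidence nodes attend to another layer."""
    weights = softmax(queries @ other.T / np.sqrt(dim))
    return weights @ other

# Integrate the three layers into one unified evidence embedding.
ev = intra_layer(evidence)
ev = ev + cross_layer(ev, context) + cross_layer(ev, reference)
unified = ev.mean(axis=0)  # single vector summarizing the evidence graph
print(unified.shape)       # (8,)
```

In the actual model this unified embedding would condition the prompt encoder for verdict prediction; here it is simply a pooled vector.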

Delvin Ce Zhang, Dongwon Lee • 2025

Related benchmarks

Task               | Dataset      | Metric   | Result | Rank
Claim Verification | AIChartClaim | Macro F1 | 70     | 38
Claim Verification | ChartCheck   | Macro F1 | 0.623  | 38
Claim Verification | Mocheg       | Macro F1 | 46     | 32
Claim Verification | MR2          | Macro F1 | 70.4   | 32
