
Faithful Logical Reasoning via Symbolic Chain-of-Thought

About

While the recent Chain-of-Thought (CoT) technique enhances the reasoning ability of large language models (LLMs) with the theory of mind, it may still struggle with logical reasoning that relies heavily on symbolic expressions and rigid deduction rules. To strengthen the logical reasoning capability of LLMs, we propose a novel Symbolic Chain-of-Thought, namely SymbCoT, a fully LLM-based framework that integrates symbolic expressions and logic rules with CoT prompting. Technically, building upon an LLM, SymbCoT 1) first translates the natural language context into a symbolic format, then 2) derives a step-by-step plan to solve the problem with symbolic logical rules, 3) followed by a verifier that checks the translation and the reasoning chain. Through thorough evaluations on 5 standard datasets with both First-Order Logic and Constraint Optimization symbolic expressions, SymbCoT shows striking and consistent improvements over the CoT method, while also setting new state-of-the-art performance. We further demonstrate that our system offers more faithful, flexible, and explainable logical reasoning. To our knowledge, this is the first work to combine symbolic expressions and rules with CoT for logical reasoning with LLMs. Code is open at https://github.com/Aiden0526/SymbCoT.
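The three-stage pipeline described above (translate, plan-and-solve, verify) can be sketched as a chain of LLM calls. This is a minimal illustration, not the paper's implementation: the `llm` callable and all prompt strings are hypothetical placeholders for an actual model API.

```python
# Minimal sketch of the SymbCoT three-stage pipeline.
# `llm` is a hypothetical callable (prompt -> completion); the prompts
# below are illustrative, not the prompts used in the paper.

def symbcot(context: str, question: str, llm) -> str:
    # 1) Translator: natural language -> symbolic format (e.g. first-order logic)
    symbolic = llm(f"Translate to first-order logic:\n{context}\nQuestion: {question}")
    # 2) Planner/Solver: step-by-step derivation using symbolic logic rules
    reasoning = llm(f"Derive the answer step by step with logic rules:\n{symbolic}")
    # 3) Verifier: check both the translation and each step of the chain
    verdict = llm(f"Verify the translation and reasoning chain:\n{symbolic}\n{reasoning}")
    return verdict

# Usage with a stub "LLM" that merely echoes the first line of its prompt,
# just to show the three calls composing:
stub = lambda prompt: prompt.splitlines()[0]
answer = symbcot("All men are mortal. Socrates is a man.", "Is Socrates mortal?", stub)
print(answer)
```

In the actual framework each stage is a separate prompted role on the same underlying LLM, which is what makes the system "fully LLM-based" rather than relying on an external symbolic solver.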

Jundong Xu, Hao Fei, Liangming Pan, Qian Liu, Mong-Li Lee, Wynne Hsu • 2024

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Logical reasoning | FOLIO (test) | Accuracy | 84.46 | 58 |
| Logical reasoning | ProofWriter (test) | Accuracy | 88.34 | 36 |
| Logical reasoning | ProntoQA (test) | Accuracy | 98.47 | 36 |
| Logical reasoning | AR-LSAT (test) | Accuracy | 70.87 | 24 |
| Logical reasoning | Deduction (test) | Accuracy | 99.03 | 20 |
