
CFNN: Continued Fraction Neural Network

About

Accurately characterizing non-linear functional manifolds with singularities is a fundamental challenge in scientific computing. While Multi-Layer Perceptrons (MLPs) dominate, their spectral bias hinders resolving high-curvature features without excessive parameters. We introduce Continued Fraction Neural Networks (CFNNs), integrating continued fractions with gradient-based optimization to provide a "rational inductive bias." This enables capturing complex asymptotics and discontinuities with extreme parameter frugality. We provide formal approximation bounds demonstrating exponential convergence and stability guarantees. To address recursive instability, we develop three implementations: CFNN-Boost, CFNN-MoE, and CFNN-Hybrid. Benchmarks show CFNNs consistently outperform MLPs in precision with one to two orders of magnitude fewer parameters, exhibiting up to a 47-fold improvement in noise robustness and physical consistency. By bridging black-box flexibility and white-box transparency, CFNNs establish a reliable "grey-box" paradigm for AI-driven scientific research.
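The abstract does not spell out the paper's exact layer parameterization, but the core idea of a "rational inductive bias" can be illustrated with a textbook continued fraction evaluated bottom-up. The sketch below is an assumption about the general form (learnable coefficient vectors `a` and `b`, with a crude denominator clamp standing in for the stability machinery the CFNN-Boost/MoE/Hybrid variants address); it is not the authors' implementation.

```python
import numpy as np

def cfnn_forward(x, a, b, eps=1e-6):
    """Evaluate a depth-K continued fraction bottom-up:

        f(x) = a[0] + b[0]*x / (a[1] + b[1]*x / (a[2] + ... / a[K]))

    `a` has K+1 entries, `b` has K; both would be learnable parameters
    in a hypothetical CFNN layer.
    """
    out = np.asarray(a[-1], dtype=float)
    for a_k, b_k in zip(a[-2::-1], b[::-1]):
        # Clamp near-zero denominators -- a crude stand-in for the
        # recursive-instability fixes described in the abstract.
        denom = np.where(np.abs(out) < eps, eps, out)
        out = a_k + b_k * x / denom
    return out

# Rational functions with poles need very few coefficients:
# 1/(1-x) = 1 + x/(1 - x), i.e. a = [1, 1, 1], b = [1, -1].
print(cfnn_forward(0.5, [1.0, 1.0, 1.0], [1.0, -1.0]))  # 2.0

# The depth-20 all-ones fraction converges to the golden ratio.
print(cfnn_forward(1.0, np.ones(21), np.ones(20)))
```

The bottom-up evaluation is why a few coefficients can encode a pole or steep asymptote that a polynomial-biased MLP would need many parameters to approximate.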

Chao Wang, Xuancheng Zhou, Ruilin Hou, Xiaoyu Cheng, Ruiyi Ding • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Classification | Waveform real-world (test) | Accuracy | 89 | 7 |
| Classification | Magic real-world (test) | Accuracy | 88 | 7 |
| Classification | Credit Card real-world (test) | Accuracy | 82 | 7 |
| Classification | Sentiment real-world (test) | Accuracy | 88 | 7 |
| Classification | Quora real-world (test) | Accuracy | 94 | 7 |
| Classification | CIFAR10 real-world (test) | Accuracy | 47 | 7 |
| Thermal-load regression | UCI Energy Efficiency single-target thermal-load regression | MSE | 7.13 | 5 |
| Feature Importance Ranking Alignment | UCI Energy Efficiency | Spearman Correlation | 0.6944 | 4 |
