
Bayesian Symbolic Regression

About

Interpretability is crucial for machine learning in many scenarios, such as quantitative finance, banking, and healthcare. Symbolic regression (SR) is a classic interpretable machine learning method that bridges inputs X and outputs Y using mathematical expressions composed of basic functions. However, the search space of all possible expressions grows exponentially with expression length, making exhaustive enumeration infeasible. Genetic programming (GP) has traditionally been used in SR to search for the optimal solution, but it suffers from several limitations, e.g., difficulty in incorporating prior knowledge, and overly complicated output expressions with reduced interpretability. To address these issues, we propose a new method that fits SR under a Bayesian framework. Firstly, a Bayesian model can naturally incorporate prior knowledge (e.g., preferences over basis functions, operators, and raw features) to improve the efficiency of fitting SR. Secondly, to improve the interpretability of expressions in SR, we aim to capture concise but informative signals. To this end, we assume the expected signal has an additive structure, i.e., a linear combination of several concise expressions, whose complexity is controlled by a well-designed prior distribution. In our setup, each expression is characterized by a symbolic tree, and the proposed SR model can be solved by sampling symbolic trees from the posterior distribution using an efficient Markov chain Monte Carlo (MCMC) algorithm. Finally, compared with GP, the proposed BSR (Bayesian Symbolic Regression) method saves computer memory, with no need to maintain an updated 'genome pool'. Numerical experiments show that, compared with GP, the solutions of BSR are closer to the ground truth and the expressions are more concise. We also find that the solution of BSR is robust to hyper-parameter specifications such as the number of trees.
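The core idea above — expressions represented as symbolic trees, sampled via MCMC under a complexity-penalizing prior — can be sketched in a few dozen lines. The following is a minimal, illustrative Python sketch, not the paper's implementation: the operator set, the whole-tree Metropolis proposal, and the simple size-based prior (penalty `lam * size`) are all simplifying assumptions standing in for BSR's structured tree prior and subtree proposals.

```python
import math
import random

# Illustrative operator set (an assumption, not the paper's exact basis functions).
UNARY = {"sin": math.sin, "cos": math.cos, "exp": lambda v: math.exp(min(v, 20.0))}
BINARY = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

def random_tree(depth=2):
    """Grow a random expression tree over a single input x."""
    if depth == 0 or random.random() < 0.3:
        return ("x",)                      # leaf: the raw feature
    if random.random() < 0.5:
        op = random.choice(list(UNARY))
        return (op, random_tree(depth - 1))
    op = random.choice(list(BINARY))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    """Recursively evaluate a tree at a scalar input x."""
    if tree[0] == "x":
        return x
    if tree[0] in UNARY:
        return UNARY[tree[0]](evaluate(tree[1], x))
    return BINARY[tree[0]](evaluate(tree[1], x), evaluate(tree[2], x))

def size(tree):
    """Node count, used here as a crude complexity measure."""
    return 1 + sum(size(child) for child in tree[1:])

def log_posterior(tree, xs, ys, sigma=0.1, lam=0.5):
    """Gaussian likelihood plus a size-penalizing prior -- a stand-in for
    the paper's well-designed prior over tree structures."""
    sse = sum((evaluate(tree, x) - y) ** 2 for x, y in zip(xs, ys))
    return -sse / (2 * sigma ** 2) - lam * size(tree)

def mcmc(xs, ys, iters=2000):
    """Metropolis sampler over trees. For simplicity each proposal regrows a
    whole new tree; BSR instead uses local moves on subtrees."""
    current = random_tree()
    cur_lp = log_posterior(current, xs, ys)
    best, best_lp = current, cur_lp
    for _ in range(iters):
        proposal = random_tree()
        lp = log_posterior(proposal, xs, ys)
        if math.log(random.random()) < lp - cur_lp:   # accept/reject step
            current, cur_lp = proposal, lp
            if lp > best_lp:
                best, best_lp = proposal, lp
    return best
```

On a toy target such as y = sin(x), the size penalty steers the sampler toward the concise tree `("sin", ("x",))` rather than a deeper tree with the same fit, which is the same concision-versus-fit trade-off the prior in BSR is designed to control.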

Ying Jin, Weilin Fu, Jian Kang, Jiadong Guo, Jian Guo • 2019

Related benchmarks

Task                | Dataset                           | Metric | Result | Rank
Symbolic Regression | SRBench black-box (test)          | R²     | 0.2725 | 28
Symbolic Regression | Strogatz Dataset ε = 0.0 (test)   | R²     | 0.8455 | 20
Symbolic Regression | Strogatz Dataset ε = 0.001 (test) | R²     | 0.8224 | 20
Symbolic Regression | Strogatz Dataset ε = 0.01 (test)  | R²     | 0.8127 | 20
Symbolic Regression | Strogatz Dataset ε = 0.1 (test)   | R²     | 71.9   | 20
Symbolic Regression | Feynman Dataset ε = 0.1 (test)    | R²     | 0.6567 | 20
Symbolic Regression | Feynman Dataset ε = 0.001 (test)  | R²     | 65.38  | 20
Symbolic Regression | Feynman Dataset ε = 0.01 (test)   | R²     | 0.6734 | 20
Symbolic Regression | Feynman Dataset ε = 0.0 (test)    | R²     | 0.6609 | 20
