
An AI-powered Bayesian Generative Modeling Approach for Arbitrary Conditional Inference

About

Modern data analysis increasingly requires flexible conditional inference P(X_B | X_A), where (X_A, X_B) is an arbitrary partition of the observed variable X. Existing approaches are either restricted to a fixed conditioning structure or depend strongly on the distribution of conditioning masks during training. To address these limitations, we introduce Bayesian generative modeling (BGM), a unified framework for arbitrary conditional inference. BGM learns a generative model of X via a stochastic iterative Bayesian updating algorithm in which model parameters and latent variables are updated until convergence. Once trained, any conditional distribution can be obtained without retraining. Empirically, BGM achieves superior predictive performance with posterior predictive intervals, demonstrating that a single learned model can serve as a universal engine for conditional prediction with principled uncertainty quantification. We provide theoretical guarantees for convergence of the stochastic iterative algorithm, statistical consistency, and conditional risk bounds. The proposed BGM framework leverages modern AI to capture complex relationships among variables while adhering to Bayesian principles, offering a promising approach for a wide range of applications in modern data science. Code for BGM is available at https://github.com/liuq-lab/bayesgm. Documentation for BGM is available at https://bayesgm.readthedocs.io.
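The key property claimed in the abstract is that a single fitted joint model yields any conditional P(X_B | X_A) without retraining. The sketch below illustrates that idea in the simplest possible setting, a multivariate Gaussian fit once to data, where every conditional follows in closed form from the joint mean and covariance. This is a toy analogue for intuition only, not the BGM algorithm or the bayesgm API; all names here are illustrative.

```python
import numpy as np

# Toy illustration: fit a joint model ONCE, then answer arbitrary
# conditional queries P(X_B | X_A) for any partition (A, B) without
# refitting. Here the "model" is a multivariate Gaussian, whose
# conditionals have a closed form via the Schur complement.
rng = np.random.default_rng(0)

# Simulate correlated data X with p = 4 variables.
p = 4
L = rng.normal(size=(p, p))
X = rng.normal(size=(5000, p)) @ L.T

# "Train" once: estimate the joint mean and covariance.
mu = X.mean(axis=0)
Sigma = np.cov(X, rowvar=False)

def conditional(mu, Sigma, idx_B, idx_A, x_A):
    """Mean and covariance of X_B | X_A = x_A for an arbitrary partition."""
    S_BA = Sigma[np.ix_(idx_B, idx_A)]
    S_AA_inv = np.linalg.inv(Sigma[np.ix_(idx_A, idx_A)])
    cond_mu = mu[idx_B] + S_BA @ S_AA_inv @ (x_A - mu[idx_A])
    cond_Sigma = Sigma[np.ix_(idx_B, idx_B)] - S_BA @ S_AA_inv @ S_BA.T
    return cond_mu, cond_Sigma

# Any partition works against the same fitted model:
# condition (X_1, X_3) on (X_0, X_2) ...
m1, S1 = conditional(mu, Sigma, [1, 3], [0, 2], X[0, [0, 2]])
# ... or (X_0, X_2, X_3) on X_1 alone, with no refitting.
m2, S2 = conditional(mu, Sigma, [0, 2, 3], [1], X[0, [1]])
print(m1.shape, m2.shape)  # (2,) (3,)
```

BGM replaces the Gaussian with an AI-driven generative model learned by stochastic iterative Bayesian updating, so the conditionals are obtained by posterior computation rather than a closed form, but the "train once, condition anywhere" structure is the same.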

Qiao Liu, Wing Hung Wong • 2026

Related benchmarks

Task                 Dataset                     Result     Rank
Interval Estimation  Simulated p=50 (test)       PCC 0.937  9
Interval Estimation  Simulated p=100 (test)      PCC 0.874  9
Interval Estimation  Simulated p=300 (test)      PCC 0.863  9
Point Estimation     Simulated Data p=50 (test)  PCC 0.167  5
Point Estimation     Simulated Data p=100 (test) MSE 0.193  5
Point Estimation     Simulated Data p=300 (test) MSE 0.181  5
