Generative AI for Bayesian Computation


Bibliographic Details
Main Authors: Nick Polson, Vadim Sokolov
Format: Article
Language: English
Published: MDPI AG 2025-06-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/27/7/683
Description
Summary: Generative Bayesian Computation (GBC) provides a simulation-based approach to Bayesian inference. A Quantile Neural Network (QNN) is trained to map samples from a base distribution to the posterior distribution. Our method applies equally to parametric and likelihood-free models. By generating a large training dataset of parameter–output pairs, inference is recast as a supervised learning problem of non-parametric regression. Generative quantile methods have a number of advantages over traditional approaches such as approximate Bayesian computation (ABC) or GANs. Primarily, quantile architectures are density-free and exploit feature selection using dimensionality-reducing summary statistics. To illustrate our methodology, we analyze the classic normal–normal learning model and apply it to two real data problems: modeling traffic speed and building a surrogate model for a satellite drag dataset. We compare our methodology to state-of-the-art approaches. Finally, we conclude with directions for future research.
ISSN:1099-4300
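The core idea in the summary, simulating parameter–output pairs and fitting a quantile regression network whose output approximates posterior quantiles, can be sketched for the normal–normal model the abstract mentions. The architecture, hyperparameters, and NumPy training loop below are illustrative assumptions for exposition, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed normal-normal model: theta ~ N(0, 1), y | theta ~ N(theta, 1).
# The analytic posterior is theta | y ~ N(y / 2, 1 / 2), which lets us
# check the learned quantile map against a known answer.
n = 2000
theta = rng.normal(0.0, 1.0, n)   # simulated parameters
y = theta + rng.normal(0.0, 1.0, n)  # simulated outputs

def pinball(u, tau):
    """Pinball (check) loss; minimizing it fits the tau-th quantile."""
    return np.where(u >= 0, tau * u, (tau - 1.0) * u)

# Tiny one-hidden-layer quantile network mapping (y, tau) -> quantile estimate.
h = 32
W1 = rng.normal(0.0, 0.5, (2, h)); b1 = np.zeros(h)
W2 = rng.normal(0.0, 0.5, h); b2 = 0.0
lr = 0.05

for step in range(5000):
    tau = rng.uniform(0.05, 0.95, n)       # random quantile levels per sample
    X = np.stack([y, tau], axis=1)
    Z = np.tanh(X @ W1 + b1)
    q = Z @ W2 + b2                        # predicted tau-quantile of theta | y
    u = theta - q
    # Subgradient of the pinball loss with respect to q.
    dq = np.where(u >= 0, -tau, 1.0 - tau) / n
    dW2 = Z.T @ dq; db2 = dq.sum()
    dZ = np.outer(dq, W2) * (1.0 - Z ** 2)
    dW1 = X.T @ dZ; db1 = dZ.sum(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

def quantile_at(y_obs, tau):
    """Query the trained network: posterior tau-quantile of theta given y_obs."""
    z = np.tanh(np.array([y_obs, tau]) @ W1 + b1)
    return float(z @ W2 + b2)

# At y = 1 the analytic posterior median is 0.5.
q_med = quantile_at(1.0, 0.5)
q_lo, q_hi = quantile_at(1.0, 0.1), quantile_at(1.0, 0.9)
```

Once trained, the same network is queried at any quantile level tau, so a full posterior summary at a new observation costs only forward passes; this is the density-free property the abstract highlights, since no likelihood evaluation is ever needed.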