Stabilizing training of affine coupling layers for high-dimensional variational inference
Variational inference with normalizing flows is an increasingly popular alternative to MCMC methods. In particular, normalizing flows based on affine coupling layers (Real NVPs) are frequently used due to their good empirical performance. In theory, increasing the depth of normalizing flows should lead to more accurate posterior approximations. However, in practice, training deep normalizing flows for approximating high-dimensional posterior distributions is often infeasible due to the high variance of the stochastic gradients. In this work, we show that previous methods for stabilizing the variance of stochastic gradient descent can be insufficient to achieve stable training of Real NVPs. As the source of the problem, we identify that, during training, samples often exhibit unusually high values. As a remedy, we propose a combination of two methods: (1) soft-thresholding of the scale in Real NVPs, and (2) a bijective soft log transformation of the samples. We evaluate these and other previously proposed modifications on several challenging target distributions, including a high-dimensional horseshoe logistic regression model. Our experiments show that with our modifications, stable training of Real NVPs for posteriors with several thousand dimensions and heavy tails is possible, allowing for more accurate marginal likelihood estimation via importance sampling. Moreover, we evaluate several common training techniques and architecture choices and provide practical advice for training Real NVPs for high-dimensional variational inference. Finally, we also provide new empirical and theoretical justification that optimizing the evidence lower bound of normalizing flows leads to good posterior distribution coverage.
Saved in:
| Main Author: | Daniel Andrade |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IOP Publishing, 2024-01-01 |
| Series: | Machine Learning: Science and Technology |
| Subjects: | normalizing flows; variational inference; marginal likelihood; horseshoe prior; Bayesian Lasso; logistic regression |
| Online Access: | https://doi.org/10.1088/2632-2153/ad9a39 |
| _version_ | 1850255306937860096 |
|---|---|
| author | Daniel Andrade |
| author_facet | Daniel Andrade |
| author_sort | Daniel Andrade |
| collection | DOAJ |
| description | Variational inference with normalizing flows is an increasingly popular alternative to MCMC methods. In particular, normalizing flows based on affine coupling layers (Real NVPs) are frequently used due to their good empirical performance. In theory, increasing the depth of normalizing flows should lead to more accurate posterior approximations. However, in practice, training deep normalizing flows for approximating high-dimensional posterior distributions is often infeasible due to the high variance of the stochastic gradients. In this work, we show that previous methods for stabilizing the variance of stochastic gradient descent can be insufficient to achieve stable training of Real NVPs. As the source of the problem, we identify that, during training, samples often exhibit unusually high values. As a remedy, we propose a combination of two methods: (1) soft-thresholding of the scale in Real NVPs, and (2) a bijective soft log transformation of the samples. We evaluate these and other previously proposed modifications on several challenging target distributions, including a high-dimensional horseshoe logistic regression model. Our experiments show that with our modifications, stable training of Real NVPs for posteriors with several thousand dimensions and heavy tails is possible, allowing for more accurate marginal likelihood estimation via importance sampling. Moreover, we evaluate several common training techniques and architecture choices and provide practical advice for training Real NVPs for high-dimensional variational inference. Finally, we also provide new empirical and theoretical justification that optimizing the evidence lower bound of normalizing flows leads to good posterior distribution coverage. |
| format | Article |
| id | doaj-art-c1b8fbdcd66c433387d43adb4fd17a89 |
| institution | OA Journals |
| issn | 2632-2153 |
| language | English |
| publishDate | 2024-01-01 |
| publisher | IOP Publishing |
| record_format | Article |
| series | Machine Learning: Science and Technology |
| spelling | doaj-art-c1b8fbdcd66c433387d43adb4fd17a89; 2025-08-20T01:56:55Z; eng; IOP Publishing; Machine Learning: Science and Technology; 2632-2153; 2024-01-01; vol. 5, no. 4, 045066; 10.1088/2632-2153/ad9a39; Stabilizing training of affine coupling layers for high-dimensional variational inference; Daniel Andrade (https://orcid.org/0000-0002-1123-4369), School of Informatics and Data Science, Hiroshima University, Higashi-Hiroshima City, Hiroshima, Japan; (abstract identical to the description field above); https://doi.org/10.1088/2632-2153/ad9a39; normalizing flows; variational inference; marginal likelihood; horseshoe prior; Bayesian Lasso; logistic regression |
| spellingShingle | Daniel Andrade; Stabilizing training of affine coupling layers for high-dimensional variational inference; Machine Learning: Science and Technology; normalizing flows; variational inference; marginal likelihood; horseshoe prior; Bayesian Lasso; logistic regression |
| title | Stabilizing training of affine coupling layers for high-dimensional variational inference |
| title_full | Stabilizing training of affine coupling layers for high-dimensional variational inference |
| title_fullStr | Stabilizing training of affine coupling layers for high-dimensional variational inference |
| title_full_unstemmed | Stabilizing training of affine coupling layers for high-dimensional variational inference |
| title_short | Stabilizing training of affine coupling layers for high-dimensional variational inference |
| title_sort | stabilizing training of affine coupling layers for high dimensional variational inference |
| topic | normalizing flows; variational inference; marginal likelihood; horseshoe prior; Bayesian Lasso; logistic regression |
| url | https://doi.org/10.1088/2632-2153/ad9a39 |
| work_keys_str_mv | AT danielandrade stabilizingtrainingofaffinecouplinglayersforhighdimensionalvariationalinference |
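The two stabilization methods named in the abstract, soft-thresholding of the scale in the affine coupling layers and a bijective soft log transformation of the samples, can be illustrated with a minimal NumPy sketch. The specific functional forms below (a tanh clamp on the log-scale and sign(x)·log(1 + |x|) as the soft log) are common choices assumed here for illustration only; the paper's exact definitions may differ.

```python
import numpy as np

def soft_log(x, forward=True):
    # Bijective "soft log": roughly the identity near 0, logarithmic in the
    # tails, so heavy-tailed samples are compressed before entering the next
    # layer. The form sign(x) * log(1 + |x|) is an assumed illustration.
    if forward:
        return np.sign(x) * np.log1p(np.abs(x))
    return np.sign(x) * np.expm1(np.abs(x))

def affine_coupling(x, scale_net, shift_net, clamp=2.0):
    # One Real NVP coupling layer: split dimensions in half and transform the
    # second half conditioned on the first.
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    # Soft-threshold the log-scale with a tanh clamp so |log s| <= clamp.
    # This keeps exp(log_s) and the log-det-Jacobian bounded no matter how
    # large the raw network output gets (one common stabilization; the
    # paper's exact thresholding may differ).
    log_s = clamp * np.tanh(scale_net(x1) / clamp)
    y2 = x2 * np.exp(log_s) + shift_net(x1)
    log_det = log_s.sum(axis=-1)
    return np.concatenate([x1, y2], axis=-1), log_det
```

With the clamp, each coupling layer can change the log-density by at most `clamp` per transformed dimension, which bounds the Jacobian term that enters the stochastic ELBO gradient; the soft log additionally prevents unusually large samples from being amplified by subsequent layers.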