Normalizing flows for high-dimensional detector simulations

Bibliographic Details
Main Authors: Florian Ernst, Luigi Favaro, Claudius Krause, Tilman Plehn, David Shih
Format: Article
Language: English
Published: SciPost 2025-03-01
Series: SciPost Physics
Online Access: https://scipost.org/SciPostPhys.18.3.081
Description
Summary: Whenever invertible generative networks are needed for LHC physics, normalizing flows show excellent performance. In this work, we investigate their performance for fast calorimeter shower simulations with increasing phase space dimension. We use fast and expressive coupling spline transformations applied to the CaloChallenge datasets. In addition to the base flow architecture we also employ a VAE to compress the dimensionality and train a generative network in the latent space. We evaluate our networks on several metrics, including high-level features, classifiers, and generation timing. Our findings demonstrate that invertible neural networks have competitive performance when compared to autoregressive flows, while being substantially faster during generation.
ISSN: 2542-4653
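
To illustrate the kind of architecture the abstract describes, the sketch below builds a coupling-layer flow with rational-quadratic spline transformations and trains it by maximum likelihood; generation is a single non-autoregressive pass, which is the source of the speed advantage over autoregressive flows. This is a minimal sketch under assumptions: the library choice (nflows), the function name build_coupling_spline_flow, layer counts, network widths, and the placeholder voxel count are illustrative and are not taken from the paper. The VAE variant mentioned in the abstract would train the same kind of flow on encoded latent vectors instead of raw showers (not shown here).

```python
# Minimal sketch of a coupling-layer flow with piecewise rational-quadratic
# spline transforms. Hyperparameters and dimensions are illustrative
# assumptions, not the authors' actual configuration.
import torch
from nflows.flows.base import Flow
from nflows.distributions.normal import StandardNormal
from nflows.transforms.base import CompositeTransform
from nflows.transforms.coupling import PiecewiseRationalQuadraticCouplingTransform
from nflows.nn.nets import ResidualNet


def build_coupling_spline_flow(features: int, num_layers: int = 8) -> Flow:
    """Stack coupling layers that apply rational-quadratic splines
    to half of the dimensions, conditioned on the other half."""
    mask = torch.zeros(features)
    mask[::2] = 1  # which dimensions get transformed in this layer

    transforms = []
    for _ in range(num_layers):
        transforms.append(
            PiecewiseRationalQuadraticCouplingTransform(
                mask=mask,
                transform_net_create_fn=lambda in_f, out_f: ResidualNet(
                    in_f, out_f, hidden_features=128, num_blocks=2
                ),
                num_bins=8,
                tails="linear",
                tail_bound=3.0,
            )
        )
        mask = 1 - mask  # alternate the transformed half between layers

    return Flow(
        transform=CompositeTransform(transforms),
        distribution=StandardNormal(shape=[features]),
    )


# Training minimizes the negative log-likelihood of preprocessed shower data.
flow = build_coupling_spline_flow(features=368)  # voxel count is a placeholder
optimizer = torch.optim.Adam(flow.parameters(), lr=1e-3)

showers = torch.rand(512, 368)  # stand-in for normalized calorimeter voxels
optimizer.zero_grad()
loss = -flow.log_prob(inputs=showers).mean()
loss.backward()
optimizer.step()

# Sampling is one forward pass through the inverse transform.
samples = flow.sample(1000)
```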