Normalizing Flows are a class of deep generative models recently proposed as a promising alternative to conventional Markov Chain Monte Carlo simulations for sampling lattice field theory configurations, since they can potentially avoid the large autocorrelations that affect Monte Carlo simulations close to the continuum limit. In this talk we explore the novel concept of Stochastic Normalizing Flows (SNFs), in which neural-network layers are combined with traditional Monte Carlo updates; in particular, we show that SNFs share the same theoretical framework as out-of-equilibrium simulations based on Jarzynski's equality. We discuss how this connection can be exploited to optimize the efficiency of this extended class of generative models, and we present numerical results in the 2d $\phi^4$ scalar field theory.
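For reference, the connection invoked above rests on Jarzynski's equality, which (in the dimensionless conventions common in the out-of-equilibrium lattice literature; the notation below is illustrative, not taken from the abstract itself) relates the ratio of the partition functions of the target and prior distributions to an exponential average of the work performed along non-equilibrium trajectories:

$$
\frac{Z}{Z_0} \;=\; e^{-\Delta F} \;=\; \big\langle\, e^{-W} \,\big\rangle_{\mathrm{f}},
$$

where $Z_0$ and $Z$ denote the partition functions of the prior and target distributions, $\Delta F$ is the corresponding free-energy difference, $W$ is the generalized work accumulated along a single trajectory, and the average is taken over all forward trajectories. In an SNF a trajectory alternates deterministic neural-network layers and stochastic Monte Carlo updates, so that $W$ collects the action differences produced by the stochastic steps minus the log-determinants of the Jacobians of the deterministic layers.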