Speaker
Mathis Gerdes
(University of Amsterdam)
Description
Recent advances in generative models have demonstrated the potential of normalizing flows for lattice field theory, particularly in mitigating critical slowing down and improving sampling efficiency. In this talk, I will discuss the role of continuous normalizing flows (CNFs), also known as neural ODEs, in learning field theories, highlighting their advantages and challenges compared to discrete flow architectures. CNFs enable expressive and scalable transformations while naturally incorporating the symmetries of the theory. I will focus on the challenges and importance of equivariance and architectural choices, drawing on applications to both scalar and gauge theories.
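As background to the abstract (not material from the talk itself), the core mechanics of a continuous normalizing flow can be sketched in a few lines: samples are transported by integrating an ODE dx/dt = f(x, t), and the change in log-density is tracked via the instantaneous change-of-variables formula d(log p)/dt = -div f(x, t). The linear vector field below is a hypothetical stand-in for a learned network, chosen so the log-Jacobian is known in closed form.

```python
import numpy as np

# Hypothetical linear vector field f(x, t) = A x; for a linear field,
# div f = trace(A), so the exact log|det J| over t in [0, 1] is trace(A).
A = np.array([[-0.5, 0.2],
              [0.0, -0.3]])

def vector_field(x, t):
    # x has shape (batch, dim); returns dx/dt for each sample.
    return x @ A.T

def divergence(x, t):
    # Divergence of a linear field is constant: trace(A).
    return np.full(x.shape[0], np.trace(A))

def cnf_forward(x0, n_steps=1000, t1=1.0):
    """Euler-integrate samples and the log-det-Jacobian from t=0 to t1.

    Returns transported samples x(t1) and log|det dx(t1)/dx(0)|,
    so that log p(x0) = log p(x1) + logdet.
    """
    dt = t1 / n_steps
    x = x0.copy()
    logdet = np.zeros(x0.shape[0])
    for i in range(n_steps):
        t = i * dt
        logdet += divergence(x, t) * dt  # accumulate integral of div f
        x = x + vector_field(x, t) * dt  # Euler step for the samples
    return x, logdet

x0 = np.random.default_rng(0).normal(size=(4, 2))
x1, logdet = cnf_forward(x0)
# Here logdet equals trace(A) * t1 = -0.8 for every sample.
```

In practice the vector field is a neural network, the divergence is estimated (e.g. with Hutchinson's trace estimator) or built to be cheap, and the field is constructed to be equivariant under the theory's symmetries, which is the architectural question the abstract highlights.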
AI keywords: equivariance; neural ODE; normalizing flows; sampling problem
Primary author
Mathis Gerdes
(University of Amsterdam)