16–20 Jun 2025
THotel, Cagliari, Sardinia, Italy
Europe/Rome timezone

A Lorentz-Equivariant Transformer for All of the LHC

18 Jun 2025, 15:26
20m
T1a+T1b

Parallel talk: Simulations & Generative Models

Speaker

Víctor Bresó Pla (University of Heidelberg)

Description

We show that the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr) yields
state-of-the-art performance for a wide range of machine learning tasks at the Large
Hadron Collider. L-GATr represents data in a geometric algebra over space-time and is
equivariant under Lorentz transformations. The underlying architecture is a versatile
and scalable transformer, which is able to break symmetries if needed. We demonstrate
the power of L-GATr for amplitude regression and jet classification, and then benchmark
it as the first Lorentz-equivariant generative network. For all three LHC tasks, we find
significant improvements over previous architectures.
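
As an illustration of the equivariance property described above, here is a minimal, self-contained Python sketch. It is not the authors' implementation; all names (boost_z, embed_vectors, equivariant_layer) are hypothetical. It embeds four-momenta into the grade-1 slots of a Cl(1,3) multivector, applies a Lorentz boost, and numerically checks that a trivially equivariant layer commutes with the transformation.

import numpy as np

ETA = np.diag([1.0, -1.0, -1.0, -1.0])  # Minkowski metric, signature (+,-,-,-)

def boost_z(rapidity: float) -> np.ndarray:
    """Lorentz boost along the z-axis as a 4x4 matrix."""
    ch, sh = np.cosh(rapidity), np.sinh(rapidity)
    L = np.eye(4)
    L[0, 0] = L[3, 3] = ch
    L[0, 3] = L[3, 0] = sh
    return L

def embed_vectors(p: np.ndarray) -> np.ndarray:
    """Embed N four-momenta (N, 4) into the 16-dim multivector layout of
    Cl(1,3): [scalar | vector (4) | bivector (6) | trivector (4) | pseudoscalar]."""
    mv = np.zeros((p.shape[0], 16))
    mv[:, 1:5] = p  # grade-1 components carry the four-momentum
    return mv

def apply_lorentz(mv: np.ndarray, L: np.ndarray) -> np.ndarray:
    """Act with a Lorentz transformation on the grade-1 part only
    (sufficient here, since all higher grades are zero in this sketch)."""
    out = mv.copy()
    out[:, 1:5] = mv[:, 1:5] @ L.T
    return out

def equivariant_layer(mv: np.ndarray, w: float = 0.7) -> np.ndarray:
    """A trivially equivariant linear map: rescaling multivectors by a
    learned scalar commutes with any Lorentz transformation."""
    return w * mv

p = np.array([[50.0, 10.0, -20.0, 40.0]])  # toy four-momentum (E, px, py, pz)
L = boost_z(0.5)

# Lorentz invariants are preserved: p·p is the same before and after the boost
m2_before = np.einsum('ni,ij,nj->n', p, ETA, p)
pL = p @ L.T
m2_after = np.einsum('ni,ij,nj->n', pL, ETA, pL)
assert np.allclose(m2_before, m2_after)

# Equivariance check: layer(L x) == L layer(x)
x = embed_vectors(p)
lhs = equivariant_layer(apply_lorentz(x, L))
rhs = apply_lorentz(equivariant_layer(x), L)
assert np.allclose(lhs, rhs)
print("equivariance check passed")

The full architecture uses far richer equivariant maps inside a scalable transformer, but the commutation property checked at the end is the defining constraint.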

AI keywords

Transformer; geometric deep learning; generative modeling; equivariant neural networks

Primary authors

Huilin Qu (CERN)
Jesse Thaler (MIT, IAIFI)
Jonas Spinner (University of Heidelberg)
Tilman Plehn (Heidelberg University, ITP)
Víctor Bresó Pla (University of Heidelberg)
