Speaker
Víctor Bresó Pla
(University of Heidelberg)
Description
We show that the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr) yields
state-of-the-art performance for a wide range of machine learning tasks at the Large
Hadron Collider. L-GATr represents data in a geometric algebra over space-time and is
equivariant under Lorentz transformations. The underlying architecture is a versatile
and scalable transformer that can break symmetries where needed. We demonstrate
the power of L-GATr for amplitude regression and jet classification, and then benchmark
it as the first Lorentz-equivariant generative network. For all three LHC tasks, we find
significant improvements over previous architectures.
AI keywords: Transformer; geometric deep learning; generative modeling; equivariant neural networks
Primary authors
Huilin Qu
(CERN)
Jesse Thaler
(MIT, IAIFI)
Jonas Spinner
(University of Heidelberg)
Tilman Plehn
(Heidelberg University, ITP)
Víctor Bresó Pla
(University of Heidelberg)