Speaker
Qibin LIU
(TDLI, Shanghai Jiao Tong University)
Description
We propose a new approach to learning powerful jet representations directly from unlabelled data. The method employs a Particle Transformer to predict masked particle representations in a latent space, removing the need for discrete tokenization and allowing it to extend to arbitrary input features beyond Lorentz four-vectors. We demonstrate the effectiveness and flexibility of this method on several downstream tasks, including jet tagging and anomaly detection. Our approach provides a new path towards a foundation model for particle physics.
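To make the idea of masked prediction in latent space concrete, the sketch below shows one way such an objective can be set up: a student encoder sees a jet with some particles masked out and is trained to regress the latent representations that a (frozen) teacher encoder produces for the original, unmasked jet. This is a minimal, hypothetical PyTorch illustration; the class and function names (ParticleEncoder, masked_latent_prediction_loss), the toy Transformer encoder standing in for the Particle Transformer, the choice of seven input features, and the student/teacher setup are all assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class ParticleEncoder(nn.Module):
    """Toy per-particle encoder (hypothetical stand-in for a Particle Transformer)."""
    def __init__(self, n_features=7, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, particles):
        # particles: (batch, n_particles, n_features) -> (batch, n_particles, d_model)
        return self.encoder(self.embed(particles))

def masked_latent_prediction_loss(student, teacher, predictor, particles, mask):
    """Regress the teacher's latent representations of the masked particles.

    particles: (batch, n_particles, n_features) per-particle inputs
    mask:      (batch, n_particles) boolean, True where a particle is masked
    """
    with torch.no_grad():               # teacher provides latent targets, no gradients
        targets = teacher(particles)
    corrupted = particles.clone()
    corrupted[mask] = 0.0               # hide the masked particles' features
    latents = student(corrupted)
    preds = predictor(latents)          # predict the masked particles' latents
    return nn.functional.mse_loss(preds[mask], targets[mask])

# Example usage on random "jets": 8 jets, 30 particles, 7 features each.
student = ParticleEncoder()
teacher = ParticleEncoder()             # e.g. an exponential-moving-average copy of the student
predictor = nn.Linear(64, 64)
jets = torch.randn(8, 30, 7)
mask = torch.rand(8, 30) < 0.3          # mask roughly 30% of the particles
loss = masked_latent_prediction_loss(student, teacher, predictor, jets, mask)
loss.backward()
```

Because the targets live in a learned latent space rather than a discrete codebook, nothing in this setup ties the inputs to four-vectors alone; any additional per-particle feature simply widens the input dimension of the encoder.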
Primary authors
Qibin LIU
(TDLI, Shanghai Jiao Tong University)
Shudong WANG
(Institute of High Energy Physics, Chinese Academy of Sciences)
Huilin Qu
(CERN)