16–20 Jun 2025
THotel, Cagliari, Sardinia, Italy
Europe/Rome timezone

Transfer Learning for Smart Background Simulation at Belle II

Not scheduled
20m
Via dei Giudicati, 66, 09131 Cagliari (CA), Italy
Poster + Flashtalk: Simulations & Generative Models

Speaker

Mr David Giesegh (LMU Munich)

Description

In experimental particle physics, the development of analyses depends heavily on the accurate simulation of background processes, covering both the particle collisions and decays and their subsequent interactions with the detector. For any specific analysis, however, a large fraction of these simulated events is discarded by a selection tailored to identifying the events of interest. At Belle II, simulating the particle collision and subsequent particle decays is far less computationally expensive than the rest of the simulation and reconstruction chain. The cost of generating large simulated datasets for specific selections could therefore be reduced by predicting, before running the costly part of the simulation, which events will pass the selection.

Deep neural networks have shown promise at this task, even when there is no obvious correlation between the quantities available at this early stage and those available after the full simulation. These models, however, must be trained on preexisting large datasets, which, especially for selections with low efficiency, defeats their purpose. To address this, we present how a model pre-trained on multiple different selections with abundant training data can be fine-tuned on a small dataset for a new low-efficiency selection, or for a known selection under different detector or software conditions.
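The pre-train/fine-tune workflow described above can be sketched in miniature. The following toy example is purely illustrative and assumes nothing about the actual Belle II models or data: it replaces the deep network with a small linear model in numpy, uses synthetic stand-ins for generator-level event features, and treats each "selection" as a hidden linear decision rule. A shared representation is pre-trained on two data-rich selections, then frozen, and only a new classification head is fitted to a small labelled sample from a new selection.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

n_feat, hidden = 8, 16

# Plentiful labelled events for two "known" selections (pre-training data).
X_big = rng.normal(size=(2000, n_feat))
labels = {k: (X_big @ rng.normal(size=n_feat) > 0).astype(float) for k in "ab"}

# Shared representation W plus one classification head per selection.
W = rng.normal(scale=0.5, size=(n_feat, hidden))
heads = {k: np.zeros(hidden) for k in "ab"}

def grad_step(X, y, W, h, lr, train_W):
    """One full-batch gradient step on the logistic loss."""
    Z = X @ W                      # shared representation
    err = (sigmoid(Z @ h) - y) / len(y)
    h_new = h - lr * Z.T @ err
    if train_W:                    # update the shared layer only when pre-training
        W = W - lr * X.T @ (err[:, None] * h[None, :])
    return W, h_new

# Pre-train: alternate gradient steps over the data-rich selections.
for _ in range(200):
    for k in "ab":
        W, heads[k] = grad_step(X_big, labels[k], W, heads[k], lr=0.2, train_W=True)

# Fine-tune: a new low-efficiency selection with only 100 labelled events.
w_new = rng.normal(size=n_feat)          # hidden "true" selection rule
X_small = rng.normal(size=(100, n_feat))
y_small = (X_small @ w_new > 0).astype(float)

h_c = np.zeros(hidden)
for _ in range(1000):                    # shared layer W stays frozen here
    _, h_c = grad_step(X_small, y_small, W, h_c, lr=0.5, train_W=False)

# Evaluate on held-out events from the new selection.
X_test = rng.normal(size=(500, n_feat))
y_test = (X_test @ w_new > 0).astype(float)
acc = float(np.mean((sigmoid(X_test @ W @ h_c) > 0.5) == y_test))
print(f"fine-tuned accuracy on new selection: {acc:.2f}")
```

The key design point mirrored here is that only the small head is trained on the scarce new-selection labels, while the representation learned from the data-rich selections is reused unchanged; the abstract's actual models (the keywords suggest transformers) follow the same freeze-and-fine-tune pattern at much larger scale.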

AI keywords: transformers; transfer learning; fine-tuning; classification

Primary author

Mr David Giesegh (LMU Munich)

Co-authors

Boyang Yu (LMU Munich); Dr Nikolai Krug (LMU Munich); Prof. Thomas Kuhr (LMU Munich)

Presentation materials

There are no materials yet.