Description
KM3NeT is a new research infrastructure housing the next generation of neutrino telescopes in the Mediterranean deep sea. The facility comprises two detectors, KM3NeT/ARCA and KM3NeT/ORCA, consisting of 230 and 115 vertically arranged detection units, respectively, each equipped with 18 digital optical modules. The photomultipliers within each optical module detect the Cherenkov light emitted by charged particles propagating through the seawater. KM3NeT/ARCA is optimized for the search for astrophysical neutrino sources in the TeV to PeV range, whereas KM3NeT/ORCA is used to study neutrino oscillation phenomena in the 1-100 GeV energy range.
The current KM3NeT/ORCA telescope, with 24 deployed detection units, is still under construction and has not yet reached its full neutrino reconstruction capability. When training any deep learning model, no explicit information about the physics or the detector is provided, nor is such information embedded in the data; it therefore remains unknown to the model. This study demonstrates the efficacy of transformer models, as large representation models, in retaining valuable information from simulations of the complete detector when evaluating data from various smaller KM3NeT/ORCA configurations. The study leverages a strength of transformers over other models by incorporating attention masks inspired by the physics and the detector design. This makes it possible to filter out irrelevant background light pulses and to focus on those resulting from a neutrino interaction, while at the same time capturing the physics measured by the telescope.
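The idea of a physics-inspired attention mask can be sketched as follows. This is an illustrative toy, not the collaboration's implementation: the function names, the causality criterion (two pulses may attend to each other only if light propagating in seawater could connect them within their time difference), and the constants (`c_water`, `t_tol`) are all assumptions made for the example.

```python
import numpy as np

def causal_light_mask(times, positions, c_water=0.22, t_tol=5.0):
    """Hypothetical physics-inspired mask: pulse j is visible from pulse i
    only if light in seawater (speed c_water, in m/ns) could travel between
    the two optical modules within their time difference (plus a tolerance
    t_tol in ns). Random background coincidences typically violate this.

    times: (n,) pulse arrival times in ns; positions: (n, 3) module positions in m.
    Returns an (n, n) boolean mask (True = pair kept, False = suppressed).
    """
    dt = np.abs(times[:, None] - times[None, :])                   # (n, n) time gaps
    dist = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
    return dt <= dist / c_water + t_tol

def masked_attention(q, k, v, mask):
    """Scaled dot-product attention with a boolean mask.

    q, k, v: (n, d) arrays, one row per detected pulse.
    Masked-out pairs get a large negative score, so their softmax
    weight is effectively zero.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    scores = np.where(mask, scores, -1e9)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights
```

In this sketch, two nearby pulses a few nanoseconds apart attend to each other, while a pulse hundreds of nanoseconds away (typical of uncorrelated optical background) is excluded before the softmax, so the model's capacity is spent on the causally connected hits.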
AI keywords: transformers, transfer learning, multi-task inference