10–11 Dec 2025
LNF
Europe/Rome timezone

An unsupervised deep learning approach for rare signal detection in astroparticle experiments: the use-case DAIDREAM

Not scheduled
5m
LNF ed.36 - B. Touschek (LNF)

Physical Poster shown at the Meeting (Poster and Video Upload)

Description

In the era of multi-messenger astronomy, where data from gamma- and cosmic-ray observatories, neutrino telescopes, gravitational interferometers, and dark matter experiments are increasingly combined, novel approaches are necessary to drive progress in astroparticle physics. The growing complexity and interconnectivity of experimental datasets demand robust solutions for data sharing, distribution, and analysis, powered by high-performance computing infrastructures and advanced machine learning methods.
This contribution reports on the research conducted within Spoke 2, as part of the activities of Work Package 3 (“Applications for experimental astroparticle physics and gravitational waves experiments”), under the use case DAIDREAM (“DAta-driven IDentification of Rare Events in Astroparticle physics through Machine learning techniques”). Our work aims to leverage unsupervised deep neural networks to address the specific challenges of the field; it was developed and validated in two distinct but complementary experimental settings, demonstrating the versatility of data-driven approaches for rare-event identification and characterization across diverse contexts.

A first application focused on enhancing sensitivity to low-amplitude signal pulses in long waveform acquisitions. We developed a tool based on Convolutional Autoencoders that performs strong compression of the waveform data, enabling direct analysis of characteristic features in the resulting latent space. This technique has been applied to measurements from the Recoil Directionality (ReD) experiment, which comprises a compact dual-phase Liquid Argon Time Projection Chamber (5×5×6 cm³) built within the DarkSide project. The trained model allows for robust identification of delayed electroluminescence (S2) signals, arising from nuclear recoils at energies down to a few keV, with a sensitivity at least comparable to that of conventional methods. Additionally, we tested this methodology on a synthetic dataset mimicking ReD waveforms, employing Variational Autoencoders to increase model flexibility. This approach achieves optimal classification for events with large signal amplitudes, with the accuracy decreasing only when the signals become indistinguishable from background fluctuations.
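The compress-then-analyse idea can be sketched with a toy stand-in: a linear autoencoder (PCA via SVD) replaces the convolutional network, and synthetic traces with a prompt S1-like pulse and an optional delayed S2-like pulse stand in for real ReD waveforms. All shapes, amplitudes, and the latent dimension of 8 are illustrative assumptions, not parameters of the actual analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n, length, latent_dim = 200, 512, 8

def waveform(has_s2):
    """Synthetic trace: noise + prompt S1-like pulse (+ optional delayed S2)."""
    w = rng.normal(0.0, 1.0, length)                      # baseline noise
    w[:20] += 10.0 * np.exp(-np.arange(20) / 5.0)         # prompt S1-like pulse
    if has_s2:
        w[300:330] += 6.0 * np.exp(-np.arange(30) / 10.0) # delayed S2-like pulse
    return w

X = np.array([waveform(i >= n // 2) for i in range(n)])
has_s2 = np.arange(n) >= n // 2                           # second half contains S2

# Linear "autoencoder": encoder = top principal components (SVD),
# decoder = their transpose; a stand-in for the convolutional model.
mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
enc = Vt[:latent_dim]                                     # (latent_dim, length)
Z = (X - mu) @ enc.T                                      # latent representation

# Simple latent-space discriminant: distance to the background centroid.
bg_centroid = Z[~has_s2].mean(axis=0)
score = np.linalg.norm(Z - bg_centroid, axis=1)
threshold = np.percentile(score[~has_s2], 95)             # 5% background rate
efficiency = float((score[has_s2] > threshold).mean())
print(f"S2 identification efficiency at 5% background rate: {efficiency:.2f}")
```

On this toy dataset the delayed pulse dominates the latent coordinates, so background and S2-bearing traces separate cleanly; the real analysis operates on far noisier data with a learned nonlinear encoder.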

A second application concerns the field of ultra-high-energy cosmic rays (UHECRs, namely protons and nuclei above 100 PeV) in the context of the Pierre Auger Observatory. Identifying rare or anomalous events, e.g. those induced by primary UHE photons or neutrinos, could provide evidence for new physics or links to dark matter, but no convincing candidates have been found so far. We addressed this challenge by training a Convolutional Autoencoder on standard hadronic events, forcing the model to efficiently learn their typical signatures. In contrast, photon- and neutrino-induced events are expected to be poorly reproduced and thus flagged as anomalies. Our studies on simulated data confirm that this data-driven, unsupervised strategy can distinguish among different classes of events, representing a promising step towards improving the sensitivity of future searches for exotic cosmic messengers.
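The anomaly-flagging logic described here, train only on the common class and flag events the model reconstructs poorly, can be illustrated with a minimal stand-in: again a linear (PCA-based) autoencoder instead of a convolutional one, with "hadronic-like" and "photon-like" trace shapes invented purely for the demonstration and bearing no relation to real Auger data.

```python
import numpy as np

rng = np.random.default_rng(1)
length, k = 256, 4                      # trace length and latent size (illustrative)
t = np.arange(length)

def hadronic(n):
    """'Hadronic-like' traces: a broad common pulse shape with varying amplitude."""
    amp = rng.uniform(5, 15, size=(n, 1))
    return amp * np.exp(-0.5 * ((t - 80) / 25.0) ** 2) + rng.normal(0, 0.5, (n, length))

def photonlike(n):
    """'Photon-like' anomalies: a narrower, delayed pulse the model never saw."""
    amp = rng.uniform(5, 15, size=(n, 1))
    return amp * np.exp(-0.5 * ((t - 150) / 8.0) ** 2) + rng.normal(0, 0.5, (n, length))

# Train the linear autoencoder on hadronic events only.
train = hadronic(300)
mu = train.mean(axis=0)
_, _, Vt = np.linalg.svd(train - mu, full_matrices=False)
enc = Vt[:k]

def recon_error(X):
    """Encode, decode, and return the per-event reconstruction error."""
    Z = (X - mu) @ enc.T
    return np.linalg.norm(X - (mu + Z @ enc), axis=1)

bg_err = recon_error(hadronic(200))
an_err = recon_error(photonlike(200))
threshold = np.percentile(bg_err, 99)   # 1% false-alarm rate on hadronic events
efficiency = float((an_err > threshold).mean())
print(f"anomaly detection efficiency at 1% false-alarm rate: {efficiency:.2f}")
```

Because the model only learns the hadronic pulse shape, the photon-like traces cannot be reconstructed and their error exceeds the background threshold, which is exactly the selection principle the abstract describes for the convolutional case.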

Authors

Alessia Rita Tricomi (Istituto Nazionale di Fisica Nucleare), Claudio Orazio De Maria (Università di Catania), Gioacchino Alex Anastasi (Università di Catania & INFN Catania), Marzio De Napoli (Istituto Nazionale di Fisica Nucleare), Dr Noemi Pino (Istituto Nazionale di Fisica Nucleare - Laboratori Nazionali del Sud), Paules Alkess Zakhary (Istituto Nazionale di Fisica Nucleare), Rossella Caruso (Istituto Nazionale di Fisica Nucleare), Sebastiana Maria Puglia (Istituto Nazionale di Fisica Nucleare), Sebastiano Albergo (Istituto Nazionale di Fisica Nucleare)

Presentation materials

There are no materials yet.