Data reduction for Beam Dump eXperiment with Artificial Intelligence Algorithm

Not scheduled
20m
Aula Magna Lingotto (Torino)

Via Nizza 242, Torino
Poster and book of abstract

Speaker

Fabio Rossi (Istituto Nazionale di Fisica Nucleare)

Description

The Beam Dump eXperiment at Jefferson Lab plans to acquire Light Dark Matter data in streaming mode. In this triggerless acquisition mode each front-end sends all of its data over the network, which must therefore handle "n" times the data rate of a single front-end and quickly becomes the bottleneck of the streaming readout (SRO) system. In addition, the experiment needs the waveform of each event to be available for the high-level analysis.

The idea, to make compression effective and reduce the pressure on the Ethernet devices, is to insert a compression node after each digitizer board. The signals can then be decompressed only at the end of the chain, or even stored compressed to save storage space. This compression is possible because none of the network devices uses the waveform or the charge information of the acquired signals.

We want to achieve a compression ratio of about 4 and sustain an event rate of 300 kHz, which is almost impossible with standard compression algorithms but feasible with an autoencoder, an AI-based algorithm.
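To illustrate the technique, the following is a minimal sketch, not the BDX implementation: a linear autoencoder that compresses synthetic 128-sample detector-like pulses into a 32-value latent code, i.e. a compression ratio of 4. The waveform length, latent size, pulse shape, and training settings are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N_SAMPLES = 128   # samples per digitized waveform (assumed)
N_LATENT = 32     # latent size -> compression ratio 128/32 = 4

def make_waveforms(n):
    """Synthetic pulses: exponential decay with random amplitude and start time, plus noise."""
    t = np.arange(N_SAMPLES)
    amp = rng.uniform(0.5, 1.0, size=(n, 1))
    t0 = rng.integers(10, 40, size=(n, 1))
    pulse = amp * np.exp(-np.clip(t - t0, 0, None) / 12.0) * (t >= t0)
    return pulse + rng.normal(0.0, 0.01, size=(n, N_SAMPLES))

X = make_waveforms(512)

# Encoder/decoder weight matrices, trained with plain gradient descent on the
# mean squared reconstruction error.
W_enc = rng.normal(0.0, 0.05, size=(N_SAMPLES, N_LATENT))
W_dec = rng.normal(0.0, 0.05, size=(N_LATENT, N_SAMPLES))

mse0 = float(np.mean((X @ W_enc @ W_dec - X) ** 2))  # error before training

lr = 0.01
for epoch in range(300):
    Z = X @ W_enc            # compressed representation (what would travel the network)
    X_hat = Z @ W_dec        # reconstruction at the end of the chain
    err = X_hat - X
    # Gradients of the mean squared error w.r.t. the two weight matrices
    g_dec = Z.T @ err / len(X)
    g_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

ratio = N_SAMPLES / N_LATENT
mse = float(np.mean((X @ W_enc @ W_dec - X) ** 2))
print(f"compression ratio: {ratio:.0f}, reconstruction MSE: {mse0:.4f} -> {mse:.4f}")
```

In a real deployment the encoder would run on the node after each digitizer and only the latent code would cross the network; a nonlinear autoencoder (and a quantized FPGA version of it) would replace this linear toy.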

This work describes the autoencoder development and some tests on different hardware: CPU, GPU and FPGA.

Primary authors

Fabio Rossi (Istituto Nazionale di Fisica Nucleare), Marco Battaglieri (Istituto Nazionale di Fisica Nucleare)

Presentation materials