Giuseppe Negro · 11/05/2026, 09:00 · Theoretical and experimental computing · Oral presentation
Active emulsions, droplets of liquid crystalline active matter encapsulating passive isotropic cores, represent a versatile and experimentally realisable class of topological active matter. Here, we present results from three-dimensional Lattice Boltzmann (LB) simulations of multiphase active nematic emulsions, developed and validated as part of the use case "Large Scale Simulations of Complex...
Valentina Camagni (Università & INFN, Milano-Bicocca (IT)) · 11/05/2026, 09:20 · Theoretical and experimental computing · Oral presentation
The Compact Muon Solenoid (CMS) experiment at the CERN LHC has traditionally relied on a highly selective Level-1 (L1) trigger to reduce the 40 MHz LHC collision rate to a level manageable for data readout and recording. During LHC Run 3, CMS deployed a novel 40 MHz data acquisition system that enables the continuous readout and real-time processing of L1 trigger-level detector data at the full...
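To put the 40 MHz figure in perspective, a back-of-envelope estimate shows why continuous readout of even compact trigger-level data is demanding. The per-crossing payload size below is purely hypothetical, not a CMS number:

```python
# Back-of-envelope throughput estimate for 40 MHz continuous readout.
# PAYLOAD_BYTES is a HYPOTHETICAL illustration value, not an actual CMS figure.
COLLISION_RATE_HZ = 40e6   # LHC bunch-crossing rate quoted in the abstract
PAYLOAD_BYTES = 100        # hypothetical compact trigger-level payload per crossing

throughput_gb_s = COLLISION_RATE_HZ * PAYLOAD_BYTES / 1e9   # GB/s sustained
daily_volume_tb = throughput_gb_s * 86400 / 1e3             # TB per day

print(f"sustained throughput: {throughput_gb_s:.1f} GB/s")
print(f"volume per day: {daily_volume_tb:.0f} TB")
```

Even at this modest assumed payload, the sustained rate is several GB/s and hundreds of TB per day, which is why real-time processing and reduction at the full rate is essential.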
Davide Fuligno (Istituto Nazionale di Fisica Nucleare) · 11/05/2026, 09:40 · Theoretical and experimental computing · Oral presentation
Fast Simulation of the ALICE Zero Degree Calorimeter using Generative Models
Davide Fuligno
On behalf of the ALICE Collaboration
Università di Pisa and INFN, Trieste, Italy

The ALICE experiment at the LHC faces unprecedented computing challenges in Run 3 and 4, necessitating innovative solutions to cope with the increased data-taking luminosity and the continuous readout. A critical...
Gabriele Cimador (Istituto Nazionale di Fisica Nucleare) · 11/05/2026, 10:00 · Theoretical and experimental computing · Oral presentation
On behalf of the ALICE Collaboration

The ALICE experiment has long-standing expertise in GPU computing for high-energy physics, having used hardware accelerators since Run 2. With Run 3, ALICE changed its data acquisition model to triggerless mode, relying even more heavily on GPUs to handle the data throughput produced by continuous detector readout. During Pb-Pb collisions today, ALICE...
Emidio Maria Giorgio (Istituto Nazionale di Fisica Nucleare), Riccardo Bruno (Istituto Nazionale di Fisica Nucleare) · 11/05/2026, 10:20 · Theoretical and experimental computing · Oral presentation
The KM3NeT Collaboration is building a large-scale neutrino research infrastructure in the Mediterranean Sea, comprising the ARCA detector in Italy and the ORCA detector in France. Within this framework, INFN plays a central role in the computing and data management strategy supporting both detectors.
The Tier-0 facilities, hosted at the detector sites (Portopalo di Capo Passero for ARCA...
Dr Alessandro Colombi (Istituto Nazionale di Fisica Nucleare) · 11/05/2026, 10:40 · Theoretical and experimental computing · Oral presentation
The optimization of Boron Neutron Capture Therapy (BNCT) requires an accurate computational framework capable of integrating complex neutron source characterization with precise energy deposition modeling. In this context and within the activities of the Geant4INFN project, this work presents recent advances of Geant4 simulations, using the Multi-Threading (MT) architecture and exploiting both...
Matteo Bresciani (Trinity College Dublin) · 11/05/2026, 11:30 · Theoretical and experimental computing · Oral presentation
Quantum Chromodynamics (QCD) is the theory that describes the strong interactions among fundamental particles. Its highly complex dynamics cannot be solved analytically, and a predictive approach relies on numerical simulations in which space-time is discretized on a four-dimensional lattice and the theory is solved using Monte Carlo methods. These calculations are well suited for...
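As a toy illustration of the "discretize and sample" strategy the abstract refers to (not the actual lattice-QCD code, which updates SU(3) gauge links on a four-dimensional lattice), a Metropolis Monte Carlo sweep for a free scalar field on a one-dimensional periodic lattice might look like:

```python
import math
import random

# Toy Metropolis sampler on a 1D periodic lattice. Illustrative only:
# real lattice QCD simulates gauge fields in four dimensions.
N = 32                  # number of lattice sites
phi = [0.0] * N         # field configuration, started "cold"
random.seed(0)

def local_action(i, value):
    """Action contribution of site i if it held `value` (nearest-neighbour coupling plus mass term)."""
    left, right = phi[(i - 1) % N], phi[(i + 1) % N]
    return 0.5 * ((value - left) ** 2 + (right - value) ** 2) + 0.5 * value ** 2

accepted = 0
SWEEPS = 200
for _ in range(SWEEPS):
    for i in range(N):
        proposal = phi[i] + random.uniform(-0.5, 0.5)
        dS = local_action(i, proposal) - local_action(i, phi[i])
        # Metropolis accept/reject: always accept if the action decreases,
        # otherwise accept with probability exp(-dS).
        if dS < 0 or random.random() < math.exp(-dS):
            phi[i] = proposal
            accepted += 1

acceptance = accepted / (SWEEPS * N)
print(f"acceptance rate: {acceptance:.2f}")
```

Observables are then estimated as averages over the sampled configurations; the production-scale analogue of this inner loop is what makes lattice calculations such natural candidates for GPU acceleration.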
Francesca Margari (University and INFN of Roma Tor Vergata) · 11/05/2026, 11:45 · Theoretical and experimental computing · Oral presentation
In the last decade, advances in theoretical and algorithmic methods together with the growth of computational resources have made lattice QCD a key tool at the precision frontier of particle physics, enabling systematically improvable predictions for observables measured with increasing experimental accuracy. Within this context, we present the development of software for lattice QCD and...
Antonio Falabella (Istituto Nazionale di Fisica Nucleare) · 11/05/2026, 12:00 · Theoretical and experimental computing · Oral presentation
The LHCb HLT2 Farm comprises ~4,500 nodes and ~260k CPU cores, primarily dedicated to real-time data processing. Extended idle periods during technical stops represent a significant, underexploited computing opportunity. We present an HTCondor-based opportunistic computing model that integrates the HLT2 Farm into the LHCb distributed computing infrastructure while preserving strict priority...
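A minimal sketch of how such preemptible backfill can be expressed with standard HTCondor startd policy knobs. The actual LHCb configuration is not given in the abstract; the knob names and values below are assumptions for illustration only:

```
# Illustrative HTCondor startd policy for opportunistic backfill on an
# otherwise-dedicated farm. FARM_IDLE is a hypothetical admin-controlled
# macro flipped by operations during technical stops; values are not LHCb's.
FARM_IDLE = True
START     = $(FARM_IDLE)
PREEMPT   = ($(FARM_IDLE) == False)
MaxJobRetirementTime = 600   # seconds of grace before hard eviction
```

The key design point, as in the abstract, is that real-time data processing keeps strict priority: opportunistic jobs start only while the farm is idle and are evicted quickly when it is reclaimed.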
Lisa Zangrando (Istituto Nazionale di Fisica Nucleare) · 11/05/2026, 12:15 · Theoretical and experimental computing · Oral presentation
The Cherenkov Telescope Array Observatory (CTAO) is the next-generation ground-based observatory for very-high-energy gamma-ray astronomy. The observatory will produce large volumes of scientific data, requiring a distributed computing infrastructure capable of supporting data processing, storage, and analysis for a geographically distributed scientific community.
The CTAO computing model...
Stefano Dal Pra (Istituto Nazionale di Fisica Nucleare) · 11/05/2026, 12:30 · Theoretical and experimental computing · Oral presentation
We describe the computing model that enabled the execution of large CW-search campaigns on several types of computing resources: batch, Grid, and HPC (CNAF, IGWN, ICSC). During the development and deployment phases of the framework, several resource-access paradigms changed, leading to the implementation of flexible solutions able to adapt to the different usage modes of each provider...
Dr Camilla Scapicchio (Istituto Nazionale di Fisica Nucleare, Sezione di Pisa) · 11/05/2026, 12:45 · Theoretical and experimental computing · Oral presentation
In recent years, data-driven analysis has become a central element of biomedical research. However, managing large, heterogeneous datasets distributed across multiple centres remains a significant challenge, especially in terms of workflow integration, traceability, and reproducibility. In this context, we present a computational platform developed...