The Milano-Bicocca FPGA cluster represents a great opportunity to explore different use cases for the Physics program of the CMS experiment upgrade foreseen for the High-Luminosity LHC. In this talk I will describe the FPGA cluster that will be built in Milano-Bicocca in the context of the ICSC Spoke-2 project, and present one of the many physics use cases that may benefit from an FPGA...
A multi-TeV muon collider proves to be very efficient not only for the search for new heavy neutral particles, but also for the discovery of charged bosons of the W′ type. We find that, by analyzing the associated production with a Standard Model W, charged resonances can be probed directly up to multi-TeV mass values close to the collision energy, and for very small couplings with the SM...
High energy physics research has always leveraged bleeding-edge computing solutions. Now, as experiments turn their focus to high-precision measurements, data need to be understood and analyzed better than ever; this requires, among other things, new coding paradigms that enforce reproducibility and improve usability.
In this talk I will show how it is possible to enable continuous...
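As a purely illustrative sketch of what such paradigms can look like in practice (the talk's actual tooling is not described here), the hypothetical test below pins a random seed and asserts on a frozen reference value, so that a continuous-integration pipeline can rerun it on every commit and flag any silent change in the result; run_analysis and the reference number are made up for the example.

    # test_reproducibility.py -- hypothetical regression test meant to run in CI.
    import numpy as np

    def run_analysis(seed: int = 42) -> float:
        """Stand-in for a real analysis step: generate toy data and return its mean."""
        rng = np.random.default_rng(seed)
        toy_data = rng.normal(loc=125.0, scale=2.0, size=10_000)
        return float(toy_data.mean())

    def test_analysis_is_reproducible():
        # Same seed -> bit-for-bit identical result ...
        assert run_analysis(seed=42) == run_analysis(seed=42)
        # ... and still compatible with the frozen reference value.
        assert abs(run_analysis(seed=42) - 125.0) < 0.1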
The Level-1 trigger Data Scouting (L1DS) is a novel data acquisition system under development for the Phase-2 CMS detector at the High-Luminosity LHC (HL-LHC). Its purpose is to capture and process L1 trigger information at the LHC bunch-crossing frequency, before the L1 accept decision. It has the potential for filterless detector diagnostics, luminosity studies and investigations into...
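To make the idea of filterless, per-bunch-crossing diagnostics concrete, the following is a minimal, hypothetical sketch (not the actual L1DS software): it aggregates the number of Level-1 muon candidates seen in each bunch crossing from a stream of (bx, n_candidates) records, the kind of unbiased per-bunch information that scouting exposes; the record layout is an assumption.

    # Hypothetical per-bunch-crossing occupancy from a scouting-like record stream.
    from collections import Counter

    def per_bx_counts(records):
        """Sum L1 candidate multiplicities for every bunch crossing."""
        counts = Counter()
        for bx, n_candidates in records:
            counts[bx] += n_candidates
        return counts

    # Toy input standing in for the real 40 MHz stream.
    toy_stream = [(1, 2), (1, 0), (2, 1), (3391, 4)]
    print(per_bx_counts(toy_stream))  # Counter({3391: 4, 1: 2, 2: 1})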
The talk will focus on the computing infrastructure under development as part of the WP5 activities. The aim is to provide users with an infrastructure that strikes a trade-off between deployment speed and flexibility, resource efficiency, and service performance: what we call an analysis facility.
In order to offer a general-purpose infrastructure, we leveraged container...
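As an illustration of what the user side of such an infrastructure can look like, the sketch below assumes the facility exposes a Dask scheduler that an analyzer reaches from a notebook running in a container; the scheduler address, and the choice of Dask itself, are assumptions for the example rather than a description of the actual WP5 deployment.

    # Hypothetical user-side view: connect to an (assumed) Dask scheduler
    # and fan a trivial per-chunk task out over the facility's workers.
    from dask.distributed import Client

    client = Client("tcp://dask-scheduler.example:8786")  # placeholder address

    def process(chunk_id: int) -> int:
        # Stand-in for the per-chunk analysis work executed inside a worker container.
        return chunk_id * chunk_id

    futures = client.map(process, range(100))
    print(sum(client.gather(futures)))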
The challenges expected for the future colliders era are pushing us to rethink the HEP computing models at many levels.
A simple use case tested on the INFN analysis facility will be presented in the context of WP5, exploiting FCC-ee simulations.
The presented work will provide an overview of the main technologies involved and will describe the results of a first benchmark using the IDEA detector...
The ever-growing demand for fast processing of large datasets, as in the upcoming high-luminosity phases at the Large Hadron Collider (LHC), paves the way for innovative approaches. Leveraging the ICSC cloud DataLake model and integrating ongoing experiences in High Energy Physics (HEP), a path towards an Analysis Facility (AF) is being forged. This new paradigm of data analysis moves from a...
Images acquired from aircraft, integrated with satellite-based remote sensing, allow for high-resolution data collection essential for ecosystem monitoring and risk management. This approach, combined with Artificial Intelligence (AI) algorithms, serves as a reliable tool for the calibration and validation of satellite-derived data and ensures ground-truthing capabilities for more accurate...
Gravitational waves (GWs) from compact binary coalescences can be used as a new and independent cosmological probe if external binary redshift information is injected into the inference process. Methods for incorporating redshift information range from direct detection of electromagnetic counterparts ("bright sirens") to statistical inference of binary redshift using a catalog of possible...
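The core of the argument can be stated with the standard-siren relation: the gravitational waveform measures the luminosity distance d_L directly, so any external redshift estimate z (from a counterpart or from a galaxy catalogue) constrains the Hubble constant through, e.g. for a flat ΛCDM background,

    $$ d_L(z) = (1+z)\,\frac{c}{H_0}\int_0^z \frac{dz'}{E(z')}, \qquad E(z) = \sqrt{\Omega_m (1+z)^3 + \Omega_\Lambda}, $$

which for nearby sources reduces to the Hubble law d_L ≈ cz/H_0, so even a statistical redshift assignment translates into a measurement of H_0.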
Automatically detecting the area of photovoltaic panels in images makes it possible to forecast and plan the green energy production of a community. Most existing approaches for panel detection resort to machine learning to analyse images and find the photovoltaic panels. However, each geographical area is likely to have its own surrounding/background colours and panel colours, as the...
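To illustrate why per-area colour statistics matter, here is a minimal, hypothetical colour-threshold baseline (not the method of this contribution): a per-channel acceptance window is calibrated on a few pixels labelled as panels in the same geographical area, and only then applied to new images from that area, so its specific panel and background colours are taken into account.

    # Hypothetical per-area colour-threshold baseline for panel-pixel detection.
    import numpy as np

    def fit_colour_window(panel_pixels: np.ndarray, n_sigma: float = 2.0):
        """Learn a per-channel [low, high] window from labelled panel pixels (N x 3, RGB)."""
        mean, std = panel_pixels.mean(axis=0), panel_pixels.std(axis=0)
        return mean - n_sigma * std, mean + n_sigma * std

    def panel_mask(image: np.ndarray, low: np.ndarray, high: np.ndarray) -> np.ndarray:
        """Boolean mask of pixels whose RGB values fall inside the calibrated window."""
        return np.all((image >= low) & (image <= high), axis=-1)

    # Calibrate on a handful of labelled pixels from one area, apply to an image of it.
    labelled = np.array([[40.0, 45.0, 90.0], [38.0, 50.0, 95.0], [42.0, 47.0, 88.0]])
    low, high = fit_colour_window(labelled)
    image = np.random.randint(0, 256, size=(64, 64, 3)).astype(float)
    print(int(panel_mask(image, low, high).sum()), "candidate panel pixels")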
The recent observation of gravitational waves from merging binary systems of compact astrophysical objects has opened a new window to explore the Universe. A strong effort is still ongoing to detect signals from different sources, like rotating isolated neutron stars, which are expected to produce continuous, persistent gravitational waves. In this talk, I will show that those searches are...
Numerical simulations of lattice Quantum ChromoDynamics offer a non-perturbative, first-principles approach to computing the properties of the theory of strong interactions. The design of efficient algorithms and the increasing computing power of current- and future-generation HPC systems make it possible to push simulations to more interesting (and challenging) regimes. Within the context of the...
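For orientation on what such simulations actually sample, the standard Wilson discretisation of the gauge action used in typical lattice QCD Monte Carlo codes reads

    $$ S_G[U] = \beta \sum_{P} \left( 1 - \frac{1}{N}\,\mathrm{Re}\,\mathrm{Tr}\, U_P \right), \qquad \beta = \frac{2N}{g^2}, $$

where U_P is the ordered product of link variables around an elementary plaquette; this is a textbook reference formula, not necessarily the exact action used in the work presented here.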
Complex systems are a class of interconnected entities or components whose collective behavior cannot be easily deduced from the properties of their individual parts. These systems are characterized by a high degree of interdependence, non-linear relationships, and emergent properties that arise from the interactions and feedback loops among the elements. Our primary focus will be on the...
The Geant4 toolkit is a widely used particle transport code for the simulation of high energy space missions, enabling the evaluation of their performance and driving the instrument design. The evolving landscape of modern large-scale simulations poses a new challenge in managing the production of increasingly large datasets, along with growing memory and computing requirements, which leads to the...
The High Rate Analysis platform currently being implemented offers a general-purpose environment where analyzers can scale up computations to the size of the instantiated cluster, which can in turn scale within the provider. However, in order to handle a potentially large number of users with diverse use cases, we plan to evolve the general-purpose infrastructure toward the offloading...
The software toolbox used for "big data" analysis has been changing fast in the last few years. The adoption of approaches able to exploit new hardware architectures plays a pivotal role in boosting data processing speed, resource optimisation, analysis portability and analysis preservation.
The big scientific collaborations (ATLAS, CMS, LHCb, ALICE, ...) are devoting increasing resources to...
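As one concrete example of such approaches, a declarative, implicitly parallel event loop in the style of ROOT's RDataFrame lets the same analysis code exploit all available cores without explicit threading; in the sketch below the file, tree and branch names are placeholders.

    # Minimal RDataFrame sketch (requires a ROOT build with Python bindings).
    import ROOT

    ROOT.EnableImplicitMT()                      # let the event loop use all cores
    df = ROOT.RDataFrame("Events", "data.root")  # placeholder tree / file names
    h = (df.Filter("nMuon >= 2", "at least two muons")
           .Define("pt_lead", "Muon_pt[0]")
           .Histo1D(("pt_lead", "Leading muon p_{T}", 50, 0.0, 200.0), "pt_lead"))
    h.Draw()                                     # lazy: the event loop runs here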
We discuss a computational approach to the analysis of the 331 Model in Frampton's version with $\beta = \sqrt{3}$. The model generalises the $SU(2)_L \times U(1)_Y$ weak sector of the Standard Model to $SU(3)_L \times U(1)_X$ and it predicts exactly three generations through the cancellation of gauge anomalies within an inter-generational framework. One additional feature of the model is the presence of Bileptons, which are gauge bosons of charge $\pm 2$,...
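For orientation, the electric-charge embedding usually written down for this class of models is

    $$ \frac{Q}{e} = T_3 + \beta\, T_8 + X, \qquad |\beta| = \sqrt{3}, $$

where the sign convention for β varies between papers; the |β| = √3 choice is what gives rise to the doubly charged gauge bosons (bileptons) mentioned above.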
The Advanced Calculus for Precision Physics (ACPP) use case is an initiative dedicated to enhancing the computation of scattering amplitudes and cross-sections, supporting phenomenology analyses of the prospects for detection and observation of new physics events at advanced collider physics programs (such as CERN and Fermilab) and gravitational wave detectors (including LIGO-Virgo-KAGRA, ET,...
A strong effort is ongoing to develop and improve unmodeled methods for detecting generic GW signals, including those for which we still lack precise modelling, such as the burst events produced by supernovae, magnetar flares, fast radio bursts...
In this context, coherent WaveBurst (cWB) is currently the most efficient and widely used burst pipeline in the LVK Collaboration. cWB is based on a...
Machine Learning algorithms bring new opportunities for the investigation and analysis of phenomena in the context of Astroparticle Physics and multi-messenger Astrophysics. We focus here on Water Cherenkov detectors, such as Super-Kamiokande and Hyper-Kamiokande, which offer a low-noise environment ideal for the study of neutrinos from astrophysical sources, but also for the detection of rare...
Gravitational waves are perturbations of spacetime that propagate out through the Universe at the speed of light.
The Laser Interferometer Space Antenna (LISA) will be the first space-based observatory to survey the source-rich milliHertz band of the gravitational-wave spectrum.
LISA will revolutionize our understanding of the Universe, providing observations of astrophysical sources ranging...