High-energy physics experiments at the Large Hadron Collider (LHC) at CERN rely on simulations to model particle interactions and understand experimental data. These simulations, crucial for reconstructing collision events, are traditionally performed using Monte Carlo-based methods, which are highly computationally demanding. With hundreds of thousands of CPU cores dedicated to these tasks...
The application of machine learning techniques in particle physics has accelerated the development of methodologies for exploring physics beyond the Standard Model. This talk will present an overview of anomaly detection and its potential to enhance the detection of new physics, and will discuss the adaptation and real-time deployment of anomaly detection algorithms. Additionally, a novel...
The TimeSPOT project has developed innovative sensors optimized for precise space and time measurements of minimum-ionizing particles in high-radiation environments. These sensors demonstrate exceptional spatial resolution (around 10 µm) and time resolution (around 10 ps), while withstanding high fluences (> 10¹⁷ 1 MeV n_eq/cm²). Tests on small-scale structures confirm their potential for...
Galaxy formation is a complex problem that links large-scale cosmology with small-scale astrophysics over cosmic timescales. The most principled approach, full hydrodynamical simulation, comes with high computational costs, so the development of faster models is essential. Modern field-level emulation techniques leverage Convolutional Neural Networks (CNNs) to "paint" baryonic channels...
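As a toy illustration of this "painting" idea (a hypothetical sketch, not the architecture from the talk), a small 3D CNN can map a dark-matter-only density field to a baryonic channel:

```python
# Minimal sketch of field-level "baryon painting": a 3D CNN maps a
# dark-matter density box to one baryonic channel (e.g. gas density).
# Architecture and shapes are illustrative, not the model from the talk.
import torch
import torch.nn as nn

class BaryonPainter(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(hidden, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(hidden, 1, kernel_size=3, padding=1),
        )

    def forward(self, dm_field):
        # dm_field: (batch, 1, N, N, N) log-density of dark matter
        return self.net(dm_field)  # predicted baryonic channel, same shape

model = BaryonPainter()
dm = torch.randn(2, 1, 32, 32, 32)   # toy dark-matter boxes
baryons = model(dm)                  # "painted" baryon field
loss = nn.functional.mse_loss(baryons, torch.randn_like(baryons))
```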
Understanding the population properties of double white dwarfs (DWDs) in the Milky Way is a key science goal for the upcoming gravitational wave detector, LISA. However, the vast number of galactic binaries (~$30 \times 10^6$) and the large data size (~$6 \times 10^6$) pose significant challenges for traditional Bayesian samplers. In this talk, I present a simulation-based inference framework...
The identification of burst gravitational wave signals can be challenging due to the lack of well-defined waveform models for various source types. In this study, we propose a novel approach to understanding the mass dynamics of the system that produced the burst signal by reconstructing the possible motions of masses that could generate the detected waveform within certain constraints. Our...
Gravitational wave astronomy in the era of third-generation (3G) detectors will pose significant computational challenges. While standard parameter estimation methods may remain technically feasible, the demand for more efficient inference algorithms is on the rise. We present a sequential neural simulation-based inference algorithm that merges neural ratio estimation (NRE) with nested...
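As a minimal, hypothetical sketch of the NRE ingredient (the talk's sequential algorithm and its nested-sampling integration are more involved), a classifier trained to separate joint from marginal parameter-data pairs recovers the likelihood-to-evidence ratio through its logit:

```python
# Toy neural ratio estimation (NRE): a classifier separates joint pairs
# (theta, x) from shuffled (marginal) pairs; its logit then approximates
# log p(x|theta)/p(x), usable inside a nested sampler. Illustration only.
import torch
import torch.nn as nn

classifier = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(classifier.parameters(), lr=1e-3)

def simulate(theta):                  # toy simulator: x ~ N(theta, 1)
    return theta + torch.randn_like(theta)

for _ in range(1000):
    theta = torch.randn(256, 1)       # draws from the prior
    x = simulate(theta)
    theta_marg = theta[torch.randperm(256)]   # break the pairing
    logits_joint = classifier(torch.cat([theta, x], dim=1))
    logits_marg = classifier(torch.cat([theta_marg, x], dim=1))
    loss = (nn.functional.binary_cross_entropy_with_logits(
                logits_joint, torch.ones_like(logits_joint))
          + nn.functional.binary_cross_entropy_with_logits(
                logits_marg, torch.zeros_like(logits_marg)))
    opt.zero_grad(); loss.backward(); opt.step()

# classifier logit at (theta, x_obs) ~ log likelihood-to-evidence ratio
```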
Axion-like particles (ALPs) appear in various extensions of the Standard Model and can interact with photons, leading to ALP-photon conversions in external magnetic fields. This phenomenon can introduce characteristic energy-dependent “wiggles” in gamma-ray spectra. The Cherenkov Telescope Array Observatory (CTAO) is the next-generation ground-based gamma-ray observatory, designed to enhance...
Molecular dynamics (MD) simulations are a fundamental tool for investigating the atomistic behavior of complex systems, offering deep insights into reaction mechanisms, phase transitions, and emergent properties in both condensed and soft matter. Recent advances in machine learning (ML) have driven a paradigm shift in atomistic simulations, allowing the development of force fields that...
One of the main challenges in solving quantum many-body (MB) problems is the exponential growth of the Hilbert space with system size. In this regard, a promising new alternative is offered by neural-network quantum states (NQS), an approach that parameterizes the wave function with neural-network architectures. Compared to other variational methods, NQS are highly scalable with...
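A minimal sketch of the NQS idea, using the textbook restricted Boltzmann machine (RBM) ansatz with illustrative random parameters (in practice these are optimized variationally):

```python
# Toy RBM ansatz, the classic NQS example:
# log psi(s) = a.s + sum_j log cosh(b_j + sum_i W_ji s_i), with s_i = +-1.
import numpy as np

rng = np.random.default_rng(0)
n_spins, n_hidden = 10, 20
a = rng.normal(scale=0.01, size=n_spins)           # visible biases
b = rng.normal(scale=0.01, size=n_hidden)          # hidden biases
W = rng.normal(scale=0.01, size=(n_hidden, n_spins))

def log_psi(s):
    """Unnormalized log wave-function amplitude for spin configuration s."""
    return a @ s + np.sum(np.log(np.cosh(b + W @ s)))

s = rng.choice([-1.0, 1.0], size=n_spins)          # random spin configuration
print(log_psi(s))
```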
The SHiP experiment is a proposed fixed-target experiment at the CERN SPS aimed at searching for feebly interacting particles beyond the Standard Model. One of its main challenges is reducing the large number of muons produced in the beam dump, which would otherwise create significant background in the detector. The muon shield, a system of magnets designed to deflect muons away from the...
Traditional gradient-based optimization and statistical inference methods often rely on differentiable models, making it challenging to optimize models with non-differentiable components. In this talk, I’ll introduce Learning the Universe by Learning to Optimize (LULO), a novel deep learning-based framework designed to fit non-differentiable simulators at non-linear scales to data. By...
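As a generic, hypothetical illustration of the underlying idea (not LULO itself), one can fit a differentiable neural surrogate to a non-differentiable simulator, then optimize parameters by gradient descent through the surrogate:

```python
# Generic surrogate-based optimization sketch (not the LULO framework):
# 1) learn a differentiable surrogate of a non-differentiable black box,
# 2) fit parameters to data using the surrogate's gradients.
import torch
import torch.nn as nn

def black_box_sim(theta):             # non-differentiable toy simulator
    return torch.round(theta) ** 2 + 0.1 * torch.randn_like(theta)

surrogate = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
for _ in range(2000):                 # train surrogate on (theta, sim) pairs
    theta = 4 * torch.rand(256, 1) - 2
    loss = nn.functional.mse_loss(surrogate(theta), black_box_sim(theta))
    opt.zero_grad(); loss.backward(); opt.step()

theta = torch.zeros(1, 1, requires_grad=True)   # now fit theta to data
opt_theta = torch.optim.Adam([theta], lr=1e-2)
data = torch.tensor([[1.0]])
for _ in range(500):
    loss = nn.functional.mse_loss(surrogate(theta), data)
    opt_theta.zero_grad(); loss.backward(); opt_theta.step()
```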
We extend the Particle-flow Neural Assisted Simulations (Parnassus) framework of fast simulation and reconstruction to entire collider events. In particular, we use two generative Artificial Intelligence (genAI) tools, conditional flow matching and diffusion models, to create a set of reconstructed particle-flow objects conditioned on stable truth-level particles from CMS Open Simulations....
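A minimal sketch of a conditional flow-matching training step, assuming toy shapes and a simple MLP in place of the actual Parnassus networks:

```python
# Conditional flow matching: the network learns a velocity field that
# transports noise to reconstructed objects, conditioned on truth-level
# particles. Shapes and names are illustrative only.
import torch
import torch.nn as nn

dim_reco, dim_truth = 4, 4
net = nn.Sequential(nn.Linear(dim_reco + dim_truth + 1, 128),
                    nn.ReLU(), nn.Linear(128, dim_reco))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def cfm_step(reco, truth):
    noise = torch.randn_like(reco)
    t = torch.rand(reco.shape[0], 1)
    x_t = (1 - t) * noise + t * reco          # linear interpolation path
    target_v = reco - noise                   # its constant velocity
    pred_v = net(torch.cat([x_t, truth, t], dim=1))
    loss = nn.functional.mse_loss(pred_v, target_v)
    opt.zero_grad(); loss.backward(); opt.step()
    return loss

cfm_step(torch.randn(256, dim_reco), torch.randn(256, dim_truth))
```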
The Pixel Vertex Detector (PXD) is the innermost detector of the Belle II experiment. Information from the PXD, together with data from other detectors, enables very precise vertex reconstruction. The effect of beam background on reconstruction is studied by adding measured or simulated background hit patterns to hits produced by simulated signal particles. This requires a huge sample...
We present Vipr, a novel method for pile-up removal in pp interactions using variational inference with diffusion models. Instead of using classification methods to identify which particles come from the primary collision, a generative model is trained to predict the constituents of the hard-scatter particle jets with pile-up removed. This results in an estimate of the full posterior...
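For illustration only (a generic conditional denoising-diffusion loss, not Vipr's implementation), the generative step can be sketched as training a network to predict the noise added to the clean hard-scatter jet, conditioned on the pile-up-contaminated one:

```python
# Generic conditional DDPM-style training loss: the model predicts the
# noise added to the clean jet, conditioned on the pile-up-laden jet.
import torch
import torch.nn as nn

dim = 8
eps_net = nn.Sequential(nn.Linear(2 * dim + 1, 128),
                        nn.ReLU(), nn.Linear(128, dim))

def ddpm_loss(clean_jet, pileup_jet):
    t = torch.rand(clean_jet.shape[0], 1)        # diffusion time in [0, 1]
    alpha = torch.cos(0.5 * torch.pi * t)        # simple cosine schedule
    sigma = torch.sin(0.5 * torch.pi * t)
    eps = torch.randn_like(clean_jet)
    x_t = alpha * clean_jet + sigma * eps        # noised clean jet
    pred = eps_net(torch.cat([x_t, pileup_jet, t], dim=1))
    return nn.functional.mse_loss(pred, eps)

loss = ddpm_loss(torch.randn(256, dim), torch.randn(256, dim))
```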
In experimental particle physics, the development of analyses depends heavily on the accurate simulation of background processes, including both the particle collisions/decays and their subsequent interactions with the detector. However, for any specific analysis, a large fraction of these simulated events is discarded by a selection tailored to identifying interesting events to study. At...
When varying the action parameters of a lattice gauge theory towards a critical point, such as the continuum limit, generic Markov chain Monte Carlo algorithms incur dramatic sampling penalties. Proof-of-principle studies applying flow-based generative models to lattice gauge theories have suggested that such methods can mitigate critical slowing down and topological freezing. There...
The simulation of calorimeter showers is computationally expensive, leading to the development of generative models as an alternative. Many of these models face challenges in balancing generation quality and speed. A key issue degrading simulation quality is the inaccurate modeling of distribution tails. Normalizing flow (NF) models offer a trade-off between accuracy and speed, making them...
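A textbook sketch of the NF building block in question, an affine coupling layer with an exact Jacobian log-determinant (illustrative, not a specific model from the talk):

```python
# Minimal affine coupling layer, the building block of many NF-based
# shower generators. One half of the input conditions an invertible
# affine transform of the other half.
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(nn.Linear(self.half, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * (dim - self.half)))

    def forward(self, x):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        s, t = self.net(x1).chunk(2, dim=1)
        s = torch.tanh(s)                  # keep scales well-behaved
        y2 = x2 * torch.exp(s) + t         # invertible affine transform
        log_det = s.sum(dim=1)             # exact Jacobian log-det
        return torch.cat([x1, y2], dim=1), log_det

layer = AffineCoupling(dim=16)
y, log_det = layer(torch.randn(8, 16))
```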
The density matrix of a quantum system provides complete information about its entanglement. Using generative autoregressive networks, we show how to estimate the matrix elements of the density matrix for a small quantum spin chain. From the density matrix we calculate Rényi entanglement entropies as well as the Shannon entropy at zero temperature.
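For reference, the quantities mentioned are standard: with $\rho$ the (reduced) density matrix and $p(s) = |\langle s|\psi\rangle|^2$ the ground-state basis probabilities,

$$
S_n = \frac{1}{1-n}\,\log \operatorname{Tr}\rho^{\,n}, \qquad
S_{\text{Shannon}} = -\sum_s p(s)\,\log p(s),
$$

with the von Neumann entanglement entropy recovered from $S_n$ in the limit $n \to 1$.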
Recent advances in generative models have demonstrated the potential of normalizing flows for lattice field theory, particularly in mitigating critical slowing down and improving sampling efficiency. In this talk, I will discuss the role of continuous normalizing flows (CNF) or neural ODEs in learning field theories, highlighting their advantages and challenges compared to discrete flow...
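A toy sketch of the CNF mechanics (illustrative only; real lattice CNFs use symmetry-equivariant velocity fields): integrate a learned velocity field with Euler steps and accumulate its divergence, which gives the log-density change:

```python
# Toy continuous normalizing flow: samples are transported by a learned
# velocity field v(x, t); d(log det J)/dt = div v, computed exactly here
# for a small dimension.
import torch
import torch.nn as nn

dim = 2
velocity = nn.Sequential(nn.Linear(dim + 1, 64), nn.Tanh(), nn.Linear(64, dim))

def flow(x, n_steps=20):
    """Euler-integrate dx/dt = v(x, t) and accumulate the divergence."""
    x = x.clone().requires_grad_(True)
    logdet = torch.zeros(x.shape[0])
    dt = 1.0 / n_steps
    for k in range(n_steps):
        t = torch.full((x.shape[0], 1), k * dt)
        v = velocity(torch.cat([x, t], dim=1))
        div = sum(torch.autograd.grad(v[:, i].sum(), x,
                                      create_graph=True)[0][:, i]
                  for i in range(dim))
        logdet = logdet + dt * div     # running log-det of the Jacobian
        x = x + dt * v                 # explicit Euler step
    return x, logdet

z = torch.randn(16, dim)               # samples from the base density
x, logdet = flow(z)
```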
Knowledge of the primordial matter density field from which the present non-linear observations formed is of fundamental importance for cosmology, as it contains an immense wealth of information about the physics, evolution, and initial conditions of the universe. Reconstructing this density field from galaxy survey data is a notoriously difficult task, requiring sophisticated statistical...
Fast radio bursts (FRBs) are extremely brief and bright flashes of radio waves originating from distant galaxies. Localizing FRBs to or within a host galaxy is key to exploring their physical origin(s) and using them as cosmological probes. However, poor uv-coverage of interferometric arrays and susceptibility to calibration errors can make FRBs exceptionally hard to localize accurately. I...
How much cosmological information can we reliably extract from existing and upcoming large-scale structure observations? Many summary statistics fall short of capturing the non-Gaussian nature of the late-time Universe probed by existing and upcoming measurements. We demonstrate that we can identify optimal summary statistics and link them to existing ones...
The formation of the first galaxies was a pivotal period in cosmic history that ended the cosmic dark ages and paved the way for present-day galaxies such as our Milky Way. This period, characterised by distinct conditions such as the absence of the metals crucial for efficient gas cooling, marks a frontier in cosmology and astrophysics, offering opportunities to discover novel physics...
Simulations play a crucial role in understanding the complex dynamics of particle collisions at CERN’s Large Hadron Collider (LHC). Traditionally, Monte Carlo-based simulations have been the primary tool for modeling these interactions, but their high computational cost presents significant challenges. Recently, generative machine learning models have emerged as an efficient alternative,...
Generative models based on diffusion processes have recently emerged as powerful tools in artificial intelligence, enabling high-quality sampling in a variety of domains. In this work, we propose a novel hybrid quantum-classical diffusion model, where artificial neural networks are replaced with parameterized quantum circuits to directly generate quantum states. To overcome the limitations of...
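As a hypothetical, minimal example of the parameterized-quantum-circuit ingredient (written here with PennyLane purely for illustration; the talk's model and framework may differ), a PQC can directly output a quantum state:

```python
# Minimal parameterized quantum circuit (PQC) that outputs a state vector.
# The hybrid quantum-classical diffusion model in the talk is far more
# elaborate than this sketch.
import pennylane as qml
from pennylane import numpy as np

n_qubits, n_layers = 3, 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def pqc(params):
    for layer in range(n_layers):
        for w in range(n_qubits):
            qml.RY(params[layer, w], wires=w)   # trainable rotations
        for w in range(n_qubits - 1):
            qml.CNOT(wires=[w, w + 1])          # entangling layer
    return qml.state()                          # generated state vector

params = np.random.uniform(0, 2 * np.pi, size=(n_layers, n_qubits))
state = pqc(params)    # complex amplitudes of the 3-qubit state
```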
Recent advances in machine learning have unlocked transformative approaches to longstanding challenges in fundamental physics. In this talk, I will present our latest work that harnesses physics-driven deep learning to tackle two intertwined frontiers: solving inverse problems in Quantum Chromodynamics (QCD) and deploying generative models for statistical physics and field theory. Inverse...
We show that the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr) yields state-of-the-art performance for a wide range of machine learning tasks at the Large Hadron Collider. L-GATr represents data in a geometric algebra over space-time and is equivariant under Lorentz transformations. The underlying architecture is a versatile and scalable transformer, which is able to break...
Many scientific and engineering problems are fundamentally linked to geometry, for example, designing a part to maximise strength or modelling fluid flow around an airplane wing. Thus, there is substantial interest in developing machine learning models that can not only operate on or output geometric data, but generate new geometries. Such models have the potential to revolutionise advanced...
Jet constituents provide a more detailed description of the radiation pattern within a jet compared to observables summarizing global jet properties. In Run 2 analyses at the LHC using the ATLAS detector, transformer-based taggers leveraging low-level variables outperformed traditional approaches based on high-level variables and conventional neural networks in distinguishing quark- and...
In this work we consider the problem of determining the identity of hadrons at high energies based on the topology of their energy depositions in dense matter, along with the time of the interactions. Using GEANT4 simulations of a homogeneous lead tungstate calorimeter with high transverse and longitudinal segmentation, we investigated the discrimination of protons, positive pions, and...
In recent years, discrepancies have emerged within the concordance model regarding the estimated value of the Hubble constant H0 [1907.10625] from Cosmic Microwave Background (CMB) and Supernovae data (commonly referred to as the Hubble tension), the clustering amplitude σ8 [1610.04606] from CMB and weak lensing, and the curvature ΩK [1908.09139, 1911.02087] from CMB and lensing/BAO, and...
We present a machine learning approach using normalizing flows for inferring cosmological parameters from gravitational wave events. Our methodology is general: it applies to any type of compact binary coalescence event and cosmological model, and relies on the generation of training data representing distributions of gravitational wave event parameters. These parameters are conditional on the...
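A conceptual sketch of the training objective, with a single conditional affine transform standing in for the full normalizing flow (all names and shapes illustrative):

```python
# A conditional normalizing flow q(theta | d) is fit by maximum likelihood
# on simulated (parameter, data) pairs; one conditional affine map stands
# in for the flow used in the talk.
import torch
import torch.nn as nn

dim_theta, dim_data = 2, 8
conditioner = nn.Sequential(nn.Linear(dim_data, 64), nn.ReLU(),
                            nn.Linear(64, 2 * dim_theta))
opt = torch.optim.Adam(conditioner.parameters(), lr=1e-3)
base = torch.distributions.Normal(0.0, 1.0)

def neg_log_q(theta, d):
    shift, log_scale = conditioner(d).chunk(2, dim=1)
    z = (theta - shift) * torch.exp(-log_scale)   # inverse affine map
    log_q = base.log_prob(z).sum(1) - log_scale.sum(1)
    return -log_q.mean()

theta = torch.randn(256, dim_theta)   # simulated parameters
d = torch.randn(256, dim_data)        # corresponding event summaries
loss = neg_log_q(theta, d)
loss.backward(); opt.step()
```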
The 21 cm signal from neutral hydrogen is a key probe of the Epoch of Reionization (EoR), marking the universe’s transition from a cold, neutral state to a predominantly hot, ionized one, driven by the formation of the first stars and galaxies. Extracting this faint 21 cm signal from radio interferometric data requires precise gain calibration. However, traditional calibration methods are...
In this talk I will introduce a new paradigm for cosmological inference, enabled by recent advances in machine learning and its underlying technology. By combining emulation, differentiable and probabilistic programming, scalable gradient-based sampling, and decoupled Bayesian model selection, this framework scales to extremely high-dimensional parameter spaces and enables complete Bayesian...
A major challenge in both simulation and inference within astrophysics is the lack of a reliable prior model for galaxy morphology. Existing galaxy catalogs are heterogeneous and provide an impure representation of underlying galaxy structures due to instrument noise, blending, and other observational limitations. Consequently, priors on galaxy morphology typically rely on either simplistic...
The cosmic dawn (CD) of the first luminous objects and eventual reionisation (EoR) of the intergalactic medium (IGM) remain among the greatest mysteries in modern cosmology. The 21-cm line is one of the most powerful probes of these crucial moments in the history of the Universe, providing a clean window into both cosmology and astrophysics. Current 21-cm observations are upper limits on the...
Modeling the distribution of neutral hydrogen is essential for understanding the physics of structure formation and the nature of dark matter, but accurate numerical simulations are computationally expensive. We describe a novel Variational Diffusion Model (VDM), built on a 3D CNN attention U-Net architecture, which we use in concert with the CAMELS simulation suite to generate accurate 21 cm...
Understanding the properties of galaxy populations and their evolution is directly linked to the success of large-scale surveys such as The Vera C. Rubin Observatory's Legacy Survey of Space and Time (LSST). Galaxy spectral energy distributions (SEDs) encode these properties, but observing SEDs over a broad wavelength range via spectroscopy is time-consuming. LSST will perform...
Simulating showers of particles in highly-granular detectors is a key frontier in the application of machine learning to particle physics. Achieving high accuracy and speed with generative machine learning models can enable them to augment traditional simulations and alleviate a major computing constraint. Recent developments have shown how diffusion-based generative shower simulation...
Simulation-based inference (SBI) allows amortized Bayesian inference for simulators with implicit likelihoods. However, some explicit likelihoods cannot easily be reformulated as simulators, hindering their integration into combined analyses within the SBI framework. One key example in cosmology is given by the Planck CMB likelihoods. In this talk, I will present a simple method to construct an...
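The basic trick can be sketched with a toy Gaussian likelihood (the Planck case is of course far richer than this illustration): any explicit likelihood one can sample from induces a simulator, after which standard SBI machinery applies:

```python
# Toy illustration: an explicit likelihood L(d | theta) that we can sample
# from defines a simulator d = sim(theta), whose (theta, d) pairs can feed
# any SBI method.
import numpy as np

rng = np.random.default_rng(1)

def explicit_loglike(d, theta, sigma=0.5):
    """Explicit Gaussian log-likelihood log L(d | theta)."""
    return -0.5 * np.sum((d - theta) ** 2) / sigma**2

def simulator(theta, sigma=0.5):
    """Simulator induced by the likelihood: draw d ~ L(. | theta)."""
    return theta + sigma * rng.standard_normal(theta.shape)

theta = rng.uniform(-1, 1, size=(1000, 3))      # prior draws
d = np.stack([simulator(t) for t in theta])     # training pairs for SBI
```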
The next generation of tracking detectors at upcoming and future high-luminosity hadron colliders will operate under extreme radiation levels, with an unprecedented number of track hits per proton-proton collision that can only be processed if precise timing information is made available together with state-of-the-art spatial resolution. 3D diamond pixel sensors are considered as a...
Black holes represent some of the most extreme environments in the universe, spanning vast ranges in mass, size, and energy output. Observations from the Event Horizon Telescope (EHT) have provided an unprecedented opportunity to directly image black holes, with future plans aiming to create time-resolved movies of their evolution. To fully leverage these observations, we need theoretical...