Conveners
🔀 Simulations & Generative Models
- James Alvey (University of Cambridge)
- Sven Krippendorf (DAMTP Cambridge)
- Tilman Plehn (Heidelberg University, ITP)
- Tommaso Dorigo (INFN Padova, Luleå University of Technology, MODE Collaboration, Universal Scientific Education and Research Network)
- Vilius Cepaitis (University of Geneva)
- Roberto Ruiz de Austri (IFIC UV-CSIC)
The TimeSPOT project has developed innovative sensors optimized for precise space and time measurements of minimum-ionizing particles in high-radiation environments. These sensors demonstrate exceptional spatial resolution (around 10 µm) and time resolution (around 10 ps), while withstanding high fluences (> 10¹⁷ 1 MeV n_eq/cm²). Tests on small-scale structures confirm their potential for...
Galaxy formation is a complex problem that links large-scale cosmology with small-scale astrophysics over cosmic timescales. The most principled approach, full hydrodynamical simulation, comes with high computational costs, so the development of faster models is essential. Modern field-level emulation techniques leverage Convolutional Neural Networks (CNNs) to "paint" baryonic channels...
Understanding the population properties of double white dwarfs (DWDs) in the Milky Way is a key science goal for the upcoming gravitational wave detector, LISA. However, the vast number of galactic binaries (~$30 \times 10^6$) and the large data size (~$6 \times 10^6$) pose significant challenges for traditional Bayesian samplers. In this talk, I present a simulation-based inference framework...
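The abstract above describes a neural simulation-based inference (SBI) framework; as a hedged, minimal illustration of the general SBI idea (a toy rejection-ABC loop, not the speaker's pipeline, with an invented one-parameter Gaussian simulator), one might write:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta, n=50):
    """Toy simulator: n noisy observations centred on theta."""
    return theta + rng.normal(0.0, 1.0, size=n)

theta_true = 2.0
x_obs = simulator(theta_true)
s_obs = x_obs.mean()  # summary statistic of the "observed" data

# Rejection ABC: draw parameters from the prior, keep those whose
# simulated summary lies close to the observed one.  Neural SBI methods
# replace this accept/reject step with a learned density estimator.
prior = rng.uniform(-5.0, 5.0, size=20_000)
accepted = [t for t in prior if abs(simulator(t).mean() - s_obs) < 0.1]
posterior_mean = float(np.mean(accepted))
```

The accepted samples approximate the posterior over `theta`; neural SBI makes this amortized and scalable, which is what matters for the ~$30 \times 10^6$ galactic binaries mentioned above.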
The application of machine learning techniques in particle physics has accelerated the development of methodologies for exploring physics beyond the Standard Model. This talk will present an overview of anomaly detection and its potential to enhance the detection of new physics. The talk will discuss the adaptation and real-time deployment of anomaly detection algorithms. Additionally, a novel...
One of the main challenges in solving quantum many-body (MB) problems is the exponential growth of the Hilbert space with system size.
In this regard, a promising new alternative is neural-network quantum states (NQS).
This approach leverages the parameterization of the wave function with neural-network architectures.
Compared to other variational methods, NQS are highly scalable with...
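For reference (standard notation, not specific to this talk), the variational setup behind NQS is the parameterized wave function $\psi_\theta$ and the energy bound

```latex
% Variational energy of a neural-network quantum state \psi_\theta:
E(\theta) \;=\;
\frac{\langle \psi_\theta \,|\, \hat{H} \,|\, \psi_\theta \rangle}
     {\langle \psi_\theta \,|\, \psi_\theta \rangle}
\;\ge\; E_0 ,
```

where the expectation value is typically estimated by Monte Carlo sampling from $|\psi_\theta(s)|^2$ and $\theta$ is optimized by (stochastic) gradient descent.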
Traditional gradient-based optimization and statistical inference methods often rely on differentiable models, making it challenging to optimize models with non-differentiable components. In this talk, I’ll introduce Learning the Universe by Learning to Optimize (LULO), a novel deep learning-based framework designed to fit non-differentiable simulators at non-linear scales to data. By...
We extend the Particle-flow Neural Assisted Simulations (Parnassus) framework of fast simulation and reconstruction to entire collider events. In particular, we use two generative Artificial Intelligence (genAI) tools, conditional flow matching and diffusion models, to create a set of reconstructed particle-flow objects conditioned on stable truth-level particles from CMS Open Simulations....
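Conditional flow matching, one of the two genAI tools named above, regresses a network onto a simple velocity target. The following numpy sketch (assuming the common linear interpolant; this is an illustration, not the Parnassus code) builds those training pairs:

```python
import numpy as np

rng = np.random.default_rng(1)

# Conditional flow matching with a linear interpolant:
#   x_t = (1 - t) * x0 + t * x1,
# whose conditional velocity target is u_t(x_t | x0, x1) = x1 - x0.
# A network v(x_t, t, condition) would be regressed onto this target;
# here we only construct the pairs and check the target numerically.
x0 = rng.normal(size=(4, 3))            # source (noise) particles
x1 = rng.normal(loc=2.0, size=(4, 3))   # target ("reconstructed") particles
t = rng.uniform(size=(4, 1))

x_t = (1.0 - t) * x0 + t * x1
u_target = x1 - x0

# Sanity check: a finite difference of the path in t recovers the target.
eps = 1e-6
x_t_eps = (1.0 - (t + eps)) * x0 + (t + eps) * x1
finite_diff = (x_t_eps - x_t) / eps
```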
We present a novel method for pile-up removal of pp interactions using variational inference with diffusion models, called Vipr. Instead of using classification methods to identify which particles are from the primary collision, a generative model is trained to predict the constituents of the hard-scatter particle jets with pile-up removed. This results in an estimate of the full posterior...
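The generative model above is diffusion-based; a minimal sketch of the variance-preserving forward-noising step that underlies such models (generic textbook form, not the Vipr implementation) is:

```python
import numpy as np

rng = np.random.default_rng(2)

# Variance-preserving forward process of a diffusion model:
#   x_t = sqrt(alpha_bar) * x0 + sqrt(1 - alpha_bar) * eps,  eps ~ N(0, 1).
# A denoiser is trained to predict eps from (x_t, t); sampling then runs
# the learned reverse process.  Here we only check that the forward step
# keeps unit variance for unit-variance inputs.
x0 = rng.normal(size=100_000)   # stand-in for jet-constituent features
alpha_bar = 0.3                 # cumulative noise schedule at some t
eps = rng.normal(size=x0.shape)

x_t = np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * eps
```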
In experimental particle physics, the development of analyses depends heavily on the accurate simulation of background processes, including both the particle collisions/decays and their subsequent interactions with the detector. However, for any specific analysis, a large fraction of these simulated events is discarded by a selection tailored to identifying interesting events to study. At...
The SHiP experiment is a proposed fixed-target experiment at the CERN SPS aimed at searching for feebly interacting particles beyond the Standard Model. One of its main challenges is reducing the large number of muons produced in the beam dump, which would otherwise create significant background in the detector. The muon shield, a system of magnets designed to deflect muons away from the...
Recent advances in generative models have demonstrated the potential of normalizing flows for lattice field theory, particularly in mitigating critical slowing down and improving sampling efficiency. In this talk, I will discuss the role of continuous normalizing flows (CNF) or neural ODEs in learning field theories, highlighting their advantages and challenges compared to discrete flow...
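A continuous normalizing flow evolves samples with an ODE while the log-density changes by minus the divergence of the vector field. A toy check with a linear field, where both quantities are analytic (an illustrative sketch, not a lattice application), looks like this:

```python
import numpy as np

# CNF with the linear vector field f(x) = a * x.  The instantaneous
# change of variables gives d(log p)/dt = -div f = -a, so after time T
# the log-density shifts by exactly -a*T while a sample scales by
# exp(a*T).  We verify both with explicit Euler steps.
a, T, n_steps = 0.5, 1.0, 10_000
dt = T / n_steps

x, delta_logp = 1.0, 0.0
for _ in range(n_steps):
    x += a * x * dt          # integrate dx/dt = a * x
    delta_logp += -a * dt    # integrate d(log p)/dt = -a

# Analytic results: x(T) = exp(a*T), delta_logp = -a*T.
```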
Knowledge of the primordial matter density field from which the present non-linear observations formed is of fundamental importance for cosmology, as it contains an immense wealth of information about the physics, evolution, and initial conditions of the universe. Reconstructing this density field from galaxy survey data is a notoriously difficult task, requiring sophisticated statistical...
The interTwin project develops an open-source Digital Twin Engine to integrate application-specific Digital Twins (DTs) across scientific domains. Its framework for the development of DTs supports interoperability, performance, portability and accuracy. As part of this initiative, we implemented the CaloINN normalizing-flow model for calorimeter simulations within the interTwin framework....
Generative models based on diffusion processes have recently emerged as powerful tools in artificial intelligence, enabling high-quality sampling in a variety of domains. In this work, we propose a novel hybrid quantum-classical diffusion model, where artificial neural networks are replaced with parameterized quantum circuits to directly generate quantum states. To overcome the limitations of...
The density matrix of a quantum system provides complete information about its entanglement. Using generative autoregressive networks, we show how to estimate the matrix elements for a small quantum spin chain. From the density matrix, we calculate Rényi entanglement entropies as well as the Shannon entropy at zero temperature.
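For reference, the entropies quoted above are, in standard notation (added for clarity, not taken from the abstract):

```latex
% R\'enyi entanglement entropy of order n for a reduced density matrix \rho_A:
S_n(\rho_A) = \frac{1}{1-n}\,\ln \operatorname{Tr}\rho_A^{\,n},
\qquad
\lim_{n \to 1} S_n = -\operatorname{Tr}\rho_A \ln \rho_A ,
% Shannon entropy of the basis-measurement distribution p_s = |\langle s|\psi\rangle|^2:
S_{\mathrm{Sh}} = -\sum_s p_s \ln p_s .
```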
Recent advances in machine learning have unlocked transformative approaches to longstanding challenges in fundamental physics. In this talk, I will present our latest work that harnesses physics‐driven deep learning to tackle two intertwined frontiers: solving inverse problems in Quantum Chromodynamics (QCD) and deploying generative models for statistical physics and field theory.
Inverse...
We show that the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr) yields
state-of-the-art performance for a wide range of machine learning tasks at the Large
Hadron Collider. L-GATr represents data in a geometric algebra over space-time and is
equivariant under Lorentz transformations. The underlying architecture is a versatile
and scalable transformer, which is able to break...
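Lorentz equivariance means the architecture's internal geometry respects the Minkowski inner product. A small numpy check that a boost preserves that invariant (a toy illustration of the symmetry, not L-GATr code) is:

```python
import numpy as np

# Minkowski metric with signature (+, -, -, -).
eta = np.diag([1.0, -1.0, -1.0, -1.0])

def boost_x(beta):
    """Lorentz boost along the x-axis with velocity beta (c = 1)."""
    gamma = 1.0 / np.sqrt(1.0 - beta**2)
    L = np.eye(4)
    L[0, 0] = L[1, 1] = gamma
    L[0, 1] = L[1, 0] = -gamma * beta
    return L

p = np.array([10.0, 3.0, 4.0, 5.0])   # a four-momentum (E, px, py, pz)
p_boosted = boost_x(0.6) @ p

# The invariant mass-squared p.eta.p is unchanged by the boost.
inv_before = p @ eta @ p
inv_after = p_boosted @ eta @ p_boosted
```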
Many scientific and engineering problems are fundamentally linked to geometry, for example, designing a part to maximise strength or modelling fluid flow around an airplane wing. Thus, there is substantial interest in developing machine learning models that can not only operate on or output geometric data, but generate new geometries. Such models have the potential to revolutionise advanced...
Jet constituents provide a more detailed description of the radiation pattern within a jet compared to observables summarizing global jet properties. In Run 2 analyses at the LHC using the ATLAS detector, transformer-based taggers leveraging low-level variables outperformed traditional approaches based on high-level variables and conventional neural networks in distinguishing quark- and...
It was recently demonstrated [1] that brain networks and the cosmic web share key structural features—such as degree distributions, path lengths, modularity, and information density. Inspired by this work, we apply AI-based methods to study the geometry of neuronal networks formed by isolated brain neurons in culture, with a focus on the spontaneous formation of dendritic lattices and noted...
In recent years, disparities have emerged within the context of the concordance model regarding the estimated value of the Hubble constant H0 [1907.10625] using Cosmic Microwave Background (CMB) and Supernovae data (commonly referred to as the Hubble tension), the clustering σ8 [1610.04606] using CMB and weak lensing, and the curvature ΩK [1908.09139, 1911.02087] using CMB and lensing/BAO, and...
We present a machine learning approach using normalizing flows for inferring cosmological parameters
from gravitational wave events. Our methodology is general to any type of compact binary coalescence
event and cosmological model and relies on the generation of training data representing distributions of
gravitational wave event parameters. These parameters are conditional on the...
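The core mechanism of a (conditional) normalizing flow is the change-of-variables formula. A single affine layer whose parameters depend on a conditioning variable, with invented toy conditioning functions (a sketch of the identity, not the speakers' model), can be checked to yield a normalized density:

```python
import numpy as np

# One affine "flow" layer conditioned on c:  x = mu(c) + sigma(c) * z,
# with z ~ N(0, 1).  Change of variables gives
#   log p_x(x | c) = log p_z(z) - log sigma(c).
def mu(c):    return 2.0 * c           # toy conditioning "networks"
def sigma(c): return 1.0 + c**2

def log_prob(x, c):
    z = (x - mu(c)) / sigma(c)
    log_pz = -0.5 * (z**2 + np.log(2.0 * np.pi))
    return log_pz - np.log(sigma(c))

# For fixed c the density must integrate to one over x.
c = 0.7
xs = np.linspace(-20.0, 20.0, 200_001)
integral = np.sum(np.exp(log_prob(xs, c))) * (xs[1] - xs[0])
```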
In this talk I will introduce a new paradigm for cosmological inference, enabled by recent advances in machine learning and its underlying technology. By combining emulation, differentiable and probabilistic programming, scalable gradient-based sampling, and decoupled Bayesian model selection, this framework scales to extremely high-dimensional parameter spaces and enables complete Bayesian...
A major challenge in both simulation and inference within astrophysics is the lack of a reliable prior model for galaxy morphology. Existing galaxy catalogs are heterogeneous and provide an impure representation of underlying galaxy structures due to instrument noise, blending, and other observational limitations. Consequently, priors on galaxy morphology typically rely on either simplistic...
Modeling the distribution of neutral hydrogen is essential for understanding the physics of structure formation and the nature of dark matter, but accurate numerical simulations are computationally expensive. We describe a novel Variational Diffusion Model (VDM), built on a 3D CNN attention U-Net architecture, which we use in concert with the CAMELS simulation suite to generate accurate 21 cm...
The Pixel Vertex Detector (PXD) is the innermost detector of the Belle II experiment. Information from the PXD, combined with data from other detectors, enables very precise vertex reconstruction. The effect of beam background on reconstruction is studied by adding measured or simulated background hit patterns to hits produced by simulated signal particles. This requires a huge sample...
Advances in Machine Learning, particularly Large Language Models (LLMs), enable more efficient interaction with complex datasets through tokenization and next-token prediction strategies. This talk presents and compares various approaches to structuring particle physics data as token sequences, allowing LLM-inspired models to learn event distributions and detect anomalies via next-token (or...
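The next-token anomaly-detection idea above can be illustrated at toy scale with a count-based bigram model over invented "event" tokens (the actual work uses LLM-style architectures; this only shows the scoring principle, anomaly score as negative log-likelihood):

```python
import math
from collections import Counter, defaultdict

# Bigram next-token model with add-one smoothing over event-token
# sequences; the anomaly score of a sequence is its negative
# log-likelihood under the trained counts.
vocab = ["jet", "muon", "electron", "met", "END"]
train = [["jet", "jet", "met", "END"]] * 50 + [["muon", "muon", "END"]] * 50

counts = defaultdict(Counter)
for seq in train:
    for a, b in zip(seq, seq[1:]):
        counts[a][b] += 1

def score(seq):
    nll = 0.0
    for a, b in zip(seq, seq[1:]):
        total = sum(counts[a].values()) + len(vocab)
        nll -= math.log((counts[a][b] + 1) / total)   # add-one smoothing
    return nll

typical = score(["jet", "jet", "met", "END"])
anomalous = score(["electron", "jet", "muon", "END"])   # unseen transitions
```

Sequences with transitions the model has never seen receive a higher score, which is the tokenized analogue of flagging anomalous events.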
Simulating showers of particles in highly-granular detectors is a key frontier in the application of machine learning to particle physics. Achieving high accuracy and speed with generative machine learning models can enable them to augment traditional simulations and alleviate a major computing constraint.
Recent developments have shown how diffusion-based generative shower simulation...
Simulation-based inference (SBI) allows amortized Bayesian inference for simulators with implicit likelihoods. However, some explicit likelihoods cannot easily be reformulated as simulators, hindering its integration into combined analyses within the SBI framework. One key example in cosmology is given by the Planck CMB likelihoods.
In this talk, I will present a simple method to construct...
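One generic way to bring an explicit likelihood into an SBI pipeline (stated here as a general possibility, not necessarily the method of the talk) is to turn it into a simulator by sampling from it, so it can be combined with implicit-likelihood components:

```python
import numpy as np

rng = np.random.default_rng(3)

# An explicit Gaussian likelihood p(x | theta) = N(x; theta, 1) becomes
# an SBI-compatible simulator simply by drawing samples from it.
def likelihood_simulator(theta, n=1):
    return rng.normal(loc=theta, scale=1.0, size=n)

draws = likelihood_simulator(1.5, n=200_000)
```

The resulting samples follow the explicit likelihood exactly, so downstream SBI machinery can treat it like any other black-box simulator.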
The next generation of tracking detectors at upcoming and future high-luminosity hadron colliders will operate under extreme radiation levels, with an unprecedented number of track hits per proton-proton collision that can only be processed if precise timing information is available together with state-of-the-art spatial resolution. 3D diamond pixel sensors are considered as a...
Black holes represent some of the most extreme environments in the universe, spanning vast ranges in mass, size, and energy output. Observations from the Event Horizon Telescope (EHT) have provided an unprecedented opportunity to directly image black holes, with future plans aiming to create time-resolved movies of their evolution. To fully leverage these observations, we need theoretical...