Conveners
🔀 Inference & Uncertainty
- Andreas Ipp (TU Wien, Austria)
- Heng Ik Siong (University of Glasgow)
- Johan Messchendorp (GSI/FAIR GmbH)
- Daniele Bonacorsi (Istituto Nazionale di Fisica Nucleare)
Current studies of the hadron spectrum are limited by the accuracy and consistency of datasets. Extracting information from theory models often requires fits to measurements taken at specific values of kinematic variables, which in turn requires interpolation between those points. In sparse datasets the quantification of uncertainties is problematic. Machine Learning is a powerful tool that can be used to...
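As a concrete illustration of this interpolation problem, the sketch below fits a Gaussian process to a handful of hypothetical measurements and reads off a pointwise uncertainty band between them; the data values and kernel choice are illustrative assumptions, not taken from the talk.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical sparse measurements of an observable at a few kinematic points.
x = np.array([[0.5], [1.0], [1.5], [2.2], [3.0]])   # e.g. a momentum transfer
y = np.array([1.2, 0.9, 0.75, 0.5, 0.42])           # made-up measured values

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-2)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(x, y)

# Interpolate on a fine grid; the GP returns a point estimate and a
# pointwise uncertainty that grows away from the measured points.
grid = np.linspace(0.4, 3.1, 100).reshape(-1, 1)
mean, std = gp.predict(grid, return_std=True)
```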
The Fair Universe project organised the HiggsML Uncertainty Challenge, which ran from September 2024 to 14 March 2025. This groundbreaking competition in high-energy physics (HEP) and machine learning was the first to place a strong emphasis on uncertainties, focusing both on mastering the uncertainties in the input training data and on providing credible confidence intervals in the...
Over the past 16 years, the Fermi Large Area Telescope (LAT) has significantly advanced our view of the GeV gamma-ray sky, yet several key questions remain - such as the nature of the isotropic diffuse background, the properties of the Galactic pulsar population, and the origin of the GeV excess towards the Galactic Centre. Addressing these challenges requires sophisticated astrophysical...
Neural simulation-based inference is a powerful class of machine-learning-based methods for statistical inference that naturally handles high-dimensional parameter estimation without the need to bin data into low-dimensional summary histograms. Such methods are promising for a range of measurements, including at the Large Hadron Collider, where no single observable may be optimal to scan over...
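One standard simulation-based recipe of this kind is neural ratio estimation via the likelihood-ratio trick. The toy sketch below (the simple simulator and all numbers are assumptions, not the speakers' setup) trains a classifier to separate joint (theta, x) pairs from shuffled ones and uses its output as an unbinned log-ratio over the full event.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Toy setup: 10-dimensional events whose mean depends on a 1D parameter.
n = 5000
theta = rng.uniform(-1, 1, size=(n, 1))                    # prior draws
x_joint = rng.normal(loc=theta, scale=1.0, size=(n, 10))   # simulator output
x_marg = x_joint[rng.permutation(n)]                       # break theta-x pairing

# A classifier trained to separate joint (theta, x) pairs from marginal
# ones estimates the likelihood-to-evidence ratio r(x|theta).
X = np.vstack([np.hstack([theta, x_joint]), np.hstack([theta, x_marg])])
y = np.concatenate([np.ones(n), np.zeros(n)])
clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=300).fit(X, y)

def log_ratio(theta_grid, x_obs):
    inp = np.hstack([theta_grid, np.tile(x_obs, (len(theta_grid), 1))])
    p = clf.predict_proba(inp)[:, 1]
    return np.log(p) - np.log(1 - p)      # posterior ∝ prior × r(x|theta)
```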
The Einstein Telescope (ET), along with other third-generation gravitational wave (GW) detectors, will be a key instrument for detecting GWs in the coming decades. However, analyzing the data and estimating source parameters will be challenging, especially given the large number of expected detections – of order 10^5 per year – which makes current methods based on stochastic sampling...
Deep generative models have become powerful tools for alleviating the computational burden of traditional Monte Carlo generators in producing high-dimensional synthetic data. However, validating these models remains challenging, especially in scientific domains requiring high precision, such as particle physics. Two-sample hypothesis testing offers a principled framework to address this task....
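A minimal example of such a test is the classifier two-sample test (C2ST): train a classifier to distinguish reference from generated samples and compare its held-out accuracy to the chance level of 0.5. The sketch below uses stand-in Gaussian samples; the architecture and sample sizes are arbitrary assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
real = rng.normal(0.00, 1.0, size=(5000, 8))    # stand-in for MC reference
synth = rng.normal(0.02, 1.0, size=(5000, 8))   # stand-in for generated data

X = np.vstack([real, synth])
y = np.concatenate([np.zeros(len(real)), np.ones(len(synth))])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
# Accuracy consistent with 0.5 -> the two samples are indistinguishable to
# this classifier; the binomial null distribution gives an approximate p-value.
print(f"C2ST accuracy: {acc:.3f}")
```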
How much cosmological information can we reliably extract from existing and upcoming large-scale structure observations? Many summary statistics fall short of capturing the non-Gaussian nature of the late-time Universe probed by such measurements. We demonstrate that we can identify optimal summary statistics and that we can link them with existing summary statistics....
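For orientation, a classical baseline for "optimal" summaries is score (MOPED-style) compression, which maps a data vector to one number per parameter while preserving the Fisher information of a Gaussian likelihood; learned summaries generalise this idea. The toy model below is an assumption for illustration, not the statistics studied in the talk.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy data vector: a binned, power-spectrum-like observable with Gaussian noise.
nbins = 50
k = np.linspace(0.1, 1.0, nbins)
mu = lambda theta: theta * np.exp(-k)          # model mean
C = 0.01 * np.eye(nbins)                       # data covariance
theta_fid = 1.0

# Score compression: t = (dmu/dtheta)^T C^{-1} (d - mu(theta_fid)),
# one number per parameter for a Gaussian likelihood.
dmu = (mu(theta_fid + 1e-3) - mu(theta_fid - 1e-3)) / 2e-3
Cinv = np.linalg.inv(C)

d = mu(1.05) + rng.multivariate_normal(np.zeros(nbins), C)  # mock data
t = dmu @ Cinv @ (d - mu(theta_fid))                        # compressed summary
```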
In High Energy Physics, when testing theoretical models of new physics against experimental results, the customary approach is simply to sample random points from the model's parameter space, calculate the predicted values of the desired observables, and compare them to experimental data. However, due to the typically large number of parameters in these models, this process is highly...
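A common ML remedy, sketched below under toy assumptions (the observable, target value, and model choices are all hypothetical), is to train a cheap surrogate on an initial random scan and then spend expensive evaluations only on candidate points the surrogate predicts to lie near the experimentally allowed region.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)

def observable(theta):
    """Stand-in for an expensive prediction (e.g. a spectrum calculation)."""
    return np.sin(theta[:, 0]) * np.exp(-theta[:, 1] ** 2)

theta = rng.uniform(-3, 3, size=(200, 2))      # small seed scan
obs = observable(theta)
target = 0.5                                   # hypothetical measured value

for _ in range(5):
    # Fit a cheap surrogate, propose many candidates, and evaluate only
    # those predicted to be closest to the target.
    surrogate = RandomForestRegressor(n_estimators=200).fit(theta, obs)
    cand = rng.uniform(-3, 3, size=(10000, 2))
    pred = surrogate.predict(cand)
    best = cand[np.argsort(np.abs(pred - target))[:100]]
    theta = np.vstack([theta, best])
    obs = np.concatenate([obs, observable(best)])
```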
The high-luminosity era of the LHC will offer a greatly increased number of events for more precise Standard Model measurements and Beyond Standard Model searches, but it will also pose unprecedented challenges to the detectors. To meet these challenges, the CMS detector will undergo several upgrades, including the replacement of the current endcap calorimeters with a novel High-Granularity...
Axion-like particles (ALPs) appear in various extensions of the Standard Model and can interact with photons, leading to ALP-photon conversions in external magnetic fields. This phenomenon can introduce characteristic energy-dependent “wiggles” in gamma-ray spectra. The Cherenkov Telescope Array Observatory (CTAO) is the next-generation ground-based gamma-ray observatory, designed to enhance...
Nested Sampling is a Monte Carlo method that performs parameter estimation and model comparison robustly for a variety of high-dimensional and complicated distributions. It has seen widespread use in the physical sciences; in recent years, however, it has increasingly been viewed as part of a legacy code base, with GPU-native paradigms such as neural simulation-based inference coming to the fore. In...
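For reference, the core of the algorithm fits in a few lines: maintain a set of live points, repeatedly replace the worst one with a prior draw at higher likelihood, and accumulate the evidence from the shrinking prior volume. The sketch below uses naive rejection sampling for the constrained draw (production codes use slice or MCMC moves) and a toy Gaussian likelihood.

```python
import numpy as np

rng = np.random.default_rng(3)

def loglike(theta):                      # toy 2D Gaussian likelihood
    return -0.5 * np.sum((theta / 0.1) ** 2, axis=-1)

nlive, nsteps = 500, 3000
live = rng.uniform(-1, 1, size=(nlive, 2))          # prior: uniform box
live_logl = loglike(live)
logZ = -np.inf
logw0 = np.log(1 - np.exp(-1 / nlive))              # per-step shell width

for i in range(nsteps):
    worst = np.argmin(live_logl)
    logL_star = live_logl[worst]
    # Accumulate evidence: w_i = X_i - X_{i+1} with X_i = exp(-i/nlive).
    logZ = np.logaddexp(logZ, logL_star + logw0 - i / nlive)
    # Replace the worst point with a prior draw above the threshold
    # (naive rejection; real codes use slice/MCMC moves here).
    while True:
        new = rng.uniform(-1, 1, size=2)
        if loglike(new) > logL_star:
            break
    live[worst], live_logl[worst] = new, loglike(new)
# (A full implementation also adds the remaining live-point contribution.)
```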
The Laser Interferometer Space Antenna (LISA) will provide an unprecedented window into the gravitational wave sky. However, it also presents a serious data analysis challenge to separate and classify various classes of deterministic sources, instrumental noise, and potential stochastic backgrounds. This "global fit" problem presents an extremely high-dimensional inference task that sits right...
Third-generation (3G) gravitational wave (GW) observatories will unveil a cosmic orchestra, detecting thousands of sources annually. However, their increased detection rate poses a major challenge for data analysis. Existing, widely used techniques to obtain the source parameters are prohibitively expensive, creating a bottleneck for extracting scientific insights from 3G detector data. We...
Simulation-based inference (SBI) has emerged as a powerful tool for parameter estimation, particularly in complex scenarios where traditional Bayesian methods are computationally intractable. In this work, we build upon a previous application of SBI, based on truncated neural posterior estimation (TNPE), to estimate the parameters of a gravitational wave post-merger signal, using real data...
Bayesian inference is essential for understanding the compact binaries that produce gravitational waves detected by the LIGO-Virgo-KAGRA collaboration. Performing this inference is computationally expensive and often has to be repeated multiple times with different models, e.g. different approximations of General Relativity. These repeated analyses always start from scratch, which is highly...
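One way to avoid starting from scratch, shown below on a one-dimensional toy problem (both likelihoods are assumptions for illustration), is importance reweighting: existing posterior samples from model A are reweighted by the likelihood ratio of model B to model A, with the effective sample size flagging when the two models are too dissimilar for this to work.

```python
import numpy as np

rng = np.random.default_rng(4)

def logL_A(theta):                  # likelihood used in the original analysis
    return -0.5 * (theta - 1.0) ** 2

def logL_B(theta):                  # alternative model, e.g. another approximant
    return -0.5 * (theta - 1.1) ** 2 / 1.2

# Posterior samples from the existing model-A run (here drawn exactly; in
# practice they come from the archived stochastic-sampling output).
samples = rng.normal(1.0, 1.0, size=20000)

# Importance weights w ∝ L_B / L_A turn the model-A posterior into the
# model-B posterior without rerunning the sampler.
logw = logL_B(samples) - logL_A(samples)
w = np.exp(logw - logw.max())
w /= w.sum()

ess = 1.0 / np.sum(w ** 2)          # effective sample size: overlap diagnostic
mean_B = np.sum(w * samples)
```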
Recent cosmological surveys have opened a new window onto the nature of dark energy. In our work we reconstruct the dark energy equation of state using a “flexknot” parameterisation that represents $w(a)$ as a linear spline with freely moving nodes. By combining the latest DESI Baryon Acoustic Oscillation measurements with Pantheon+ supernova data, and cross-checking our results with an...
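Concretely, a flexknot $w(a)$ is just a linear interpolation through knots whose positions and amplitudes are themselves sampled. A minimal sketch (the knot values below are arbitrary placeholders, not our posterior):

```python
import numpy as np

def w_flexknot(a, knots_a, knots_w):
    """w(a) as a linear spline through freely moving (a_i, w_i) knots."""
    order = np.argsort(knots_a)
    return np.interp(a, knots_a[order], knots_w[order])

a = np.linspace(0.1, 1.0, 200)
# A hypothetical 4-knot configuration; in the analysis the positions,
# amplitudes, and number of knots are all sampled over.
w = w_flexknot(a, np.array([0.1, 0.4, 0.7, 1.0]),
                  np.array([-0.8, -1.1, -0.9, -1.0]))
```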
The Pierre Auger Observatory is a cosmic-ray detector that uses multiple systems to simultaneously observe extensive air showers (EAS). EAS are particle cascades initiated by ultra-high-energy cosmic rays (UHECRs) interacting with the atmosphere of the Earth. Determining the sources of UHECRs requires precise knowledge of their mass composition. One key observable for estimating the mass of an...
Simulation-based inference (SBI) has seen remarkable development in recent years and has found widespread application across a range of physical sciences. A defining characteristic of SBI is its ability to perform likelihood-free inference, relying on simulators rather than explicit likelihood functions. Several representative methods have emerged within this framework, such as Approximate...
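The earliest of these methods, Approximate Bayesian Computation in its rejection form, is simple enough to state in full: draw parameters from the prior, simulate, and keep the draws whose summary statistic lands within a tolerance of the observed one. The sketch below uses a toy Gaussian simulator and an assumed tolerance.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate(theta, n=100):
    """Toy simulator: n observations with unknown mean theta."""
    return rng.normal(np.atleast_1d(theta)[:, None], 1.0,
                      size=(np.size(theta), n))

x_obs = simulate(0.7)[0]
s_obs = x_obs.mean()                           # summary statistic

# Rejection ABC: keep prior draws whose simulated summary lands within
# a tolerance eps of the observed one.
theta_prior = rng.uniform(-2, 2, size=100_000)
s_sim = simulate(theta_prior).mean(axis=1)
eps = 0.05
posterior = theta_prior[np.abs(s_sim - s_obs) < eps]
```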
The 2030s are anticipated to be a golden era in ground-based gravitational wave astronomy, with the advent of next-generation observatories such as the Einstein Telescope and Cosmic Explorer set to revolutionize our understanding of the universe. However, this unprecedented sensitivity and observational depth will come with a significant increase in the computational demands of...
In the next decade, the third generation of ground-based gravitational wave detectors, such as the European Einstein Telescope, is expected to revolutionize our understanding of compact binary mergers. With a factor-of-ten improvement in sensitivity and an extended range towards lower frequencies, the Einstein Telescope will enable the detection of longer-duration signals from binary black hole and...
High-Energy Physics experiments are rapidly escalating in generated data volume, a trend that will intensify with the upcoming High-Luminosity LHC upgrade. This surge in data necessitates critical revisions across the data processing pipeline, with particle track reconstruction being a prime candidate for improvement. In our previous work, we introduced "TrackFormers", a collection of...
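As a rough illustration of the general idea (this is an assumed minimal architecture, not the TrackFormers models themselves), a Transformer encoder can attend over an unordered set of detector hits and emit a per-hit track assignment:

```python
import torch
import torch.nn as nn

class HitEncoder(nn.Module):
    """Encode a set of detector hits and classify each hit's track."""
    def __init__(self, n_features=3, d_model=64, n_tracks=20):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=3)
        self.head = nn.Linear(d_model, n_tracks)    # per-hit track label

    def forward(self, hits):                         # hits: (batch, n_hits, 3)
        h = self.encoder(self.embed(hits))
        return self.head(h)                          # (batch, n_hits, n_tracks)

logits = HitEncoder()(torch.randn(2, 50, 3))         # toy 3D hit coordinates
```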
The Matrix Element Method (MEM) is a powerful technique for computing the event-by-event likelihood for a given theory hypothesis, simultaneously accounting for both detector effects and theoretical models. Despite its strong theoretical foundation, MEM is seldom used in analyses involving final states with many jets due to the complex, multi-dimensional integrals required to accurately model...
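Schematically, the per-event MEM likelihood is an integral of the squared matrix element times a detector transfer function over unobserved parton-level variables. The toy sketch below estimates it by importance sampling with a hand-picked Gaussian proposal; ML-based variants instead learn the proposal, e.g. with normalizing flows. All densities here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

def me_squared(z):              # toy |M|^2 over parton-level variables z
    return np.exp(-0.5 * np.sum(z ** 2, axis=1))

def transfer(x_obs, z):         # toy detector transfer function W(x_obs | z)
    return np.exp(-0.5 * np.sum((x_obs - z) ** 2, axis=1) / 0.2)

def mem_likelihood(x_obs, n=200_000):
    d = x_obs.size
    # Importance sampling with a Gaussian proposal centred on the event:
    # estimate integral of |M|^2(z) * W(x_obs|z) dz as a mean of f/q.
    z = rng.normal(x_obs, 1.0, size=(n, d))
    logq = -0.5 * np.sum((z - x_obs) ** 2, axis=1) - 0.5 * d * np.log(2 * np.pi)
    return np.mean(me_squared(z) * transfer(x_obs, z) / np.exp(logq))

L = mem_likelihood(np.array([0.3, -0.1]))
```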
The Alpha Magnetic Spectrometer (AMS-02) is a precision high-energy cosmic-ray experiment on the ISS that has been operating since 2011 and has collected more than 228 billion particles. Among them, positrons are important for understanding the particle nature of dark matter. Separating positrons from the cosmic-ray proton background is challenging above 1 TeV. Therefore, we use state-of-the-art convolutional and...
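A minimal version of such a classifier, assuming calorimeter-image-like inputs (the grid size and architecture below are placeholders, not the AMS-02 networks), might look like:

```python
import torch
import torch.nn as nn

# Small CNN separating positron from proton candidates on 2D shower images.
model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 2),                        # positron vs proton logits
)
logits = model(torch.randn(8, 1, 18, 72))    # assumed calorimeter-like grids
```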
The WASA-FRS HypHI Experiment focuses on the study of light hypernuclei by means of heavy-ion-induced reactions in 6Li collisions with 12C at 1.96 GeV/u. It is part of the WASA-FRS experimental campaign, as is the eta-prime experiment [1]. The distinctive combination of the high-resolution FRagment Separator (FRS) spectrometer [2] and the high-acceptance detector system WASA [3] is used....
Analyzing irregular and sparse time-series is a widespread problem in fundamental physics, astronomy, climate science and many other fields. This talk presents a novel Transformer-based architecture for multi-dimensional irregular time-series and sparse data, designed as a foundation model for general time-series interpolation, object classification, and representation learning. Our method...
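One common way to handle irregular sampling in a Transformer, sketched below as an assumed design rather than the talk's exact model, is to replace positional indices with a learned embedding of the observation times and mask out padding:

```python
import torch
import torch.nn as nn

class IrregularEncoder(nn.Module):
    """Attend over (time, value) pairs sampled at irregular times."""
    def __init__(self, d_model=64):
        super().__init__()
        self.time_embed = nn.Linear(1, d_model)    # learned time encoding
        self.value_embed = nn.Linear(1, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, t, v, mask):     # t, v: (batch, n, 1); mask: padding flags
        h = self.time_embed(t) + self.value_embed(v)
        return self.encoder(h, src_key_padding_mask=mask)

enc = IrregularEncoder()
t = torch.rand(2, 30, 1).sort(dim=1).values        # irregular, sorted times
v = torch.randn(2, 30, 1)                          # observed values
out = enc(t, v, mask=torch.zeros(2, 30, dtype=torch.bool))
```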
Gravitational wave (GW) interferometers, such as LIGO, Virgo, and KAGRA, detect faint signals from distant astrophysical events. However, their high sensitivity also makes them susceptible to background noise, which can obscure these signals. This noise often includes transient artifacts called "glitches", which can mimic genuine astrophysical signals or mask their true characteristics. Fast...
At the Phase-2 Upgrade of the CMS Level-1 Trigger (L1T), particles will be reconstructed by linking charged-particle tracks with clusters in the calorimeters and muon tracks from the muon stations. Up to 200 pileup interactions will be mitigated using primary-vertex reconstruction for charged particles and, for neutral particles, a weighting based on the distribution of energy in a small area. Jets...