16–20 Jun 2025
THotel, Cagliari, Sardinia, Italy
Europe/Rome timezone

Session

🔀 Explainability & Theory

17 Jun 2025, 12:00
THotel, Cagliari, Sardinia, Italy

Via dei Giudicati, 66, 09131 Cagliari (CA), Italy

Conveners

  • Tilman Plehn (Heidelberg University, ITP)
  • Roberto Ruiz de Austri (IFIC UV-CSIC)
  • Andreas Ipp (TU Wien, Austria)


  1. Yanick Thurn (Deutsch)
    17/06/2025, 12:00
    Explainability & Theory
    Poster Session A

    An important challenge in machine learning is to predict the initial conditions under which a given neural network will be trainable. We present a method for predicting the trainable regime in parameter space for deep feedforward neural networks (DNNs) based on reconstructing the input from subsequent activation layers via a cascade of single-layer auxiliary networks. We show that a single...

  2. Nachiketa Chakraborty (University of Reading)
    17/06/2025, 12:03
    Explainability & Theory
    Poster Session A

    Astrophysical sources vary across vast timescales, providing insight into extreme dynamical phenomena, from solar outbursts to distant AGNs and GRBs. These time-varying processes are often complex, nonlinear, and non-Gaussian, making it difficult to disentangle underlying causal mechanisms, which may act simultaneously or sequentially. Using solar variability and AGNs as examples, we...

  3. Harry Bevins (University of Cambridge)
    17/06/2025, 12:06
    Explainability & Theory
    Parallel talk

    Neural network emulators or surrogates are widely used in astrophysics and cosmology to approximate expensive simulations, accelerating both likelihood-based inference and training for simulation-based inference. However, emulator accuracy requirements are often justified heuristically rather than with rigorous theoretical bounds. We derive a principled upper limit on the information loss...

  4. Lorenzo Colantonio (Sapienza University of Rome and INFN Rome)
    17/06/2025, 12:26
    Explainability & Theory
    Parallel talk

    The graph coloring problem is an optimization problem involving the assignment of one of q colors to each vertex of a graph such that no two adjacent vertices share the same color. This problem is computationally challenging and arises in several practical applications. We present a novel algorithm that leverages graph neural networks to tackle the problem efficiently, particularly for large...

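The graph coloring problem defined in the abstract above can be made concrete with a short sketch (illustrative only; this is a classical greedy baseline, not the GNN-based algorithm presented in the talk):

```python
def is_proper_coloring(edges, coloring):
    """A coloring is proper if no edge joins two vertices of the same color."""
    return all(coloring[u] != coloring[v] for u, v in edges)

def greedy_coloring(n, edges):
    """Greedy baseline: visit vertices 0..n-1 in order and give each
    the smallest color not already used by a colored neighbor."""
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    coloring = {}
    for v in range(n):
        used = {coloring[w] for w in adj[v] if w in coloring}
        color = 0
        while color in used:
            color += 1
        coloring[v] = color
    return coloring

# 5-cycle: an odd cycle needs 3 colors, and greedy indeed uses 3 here
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
c = greedy_coloring(5, edges)
assert is_proper_coloring(edges, c)
```

Greedy coloring is fast but can use far more than q colors on hard instances, which is precisely the regime where learned heuristics such as the GNN approach above aim to do better.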
  5. Marvin Kohls (GSI Helmholtzzentrum für Schwerionenforschung GmbH), Dr Simon Spies (Goethe University Frankfurt)
    17/06/2025, 12:46
    Explainability & Theory
    Poster Session A

    In this conference contribution, we present our findings on applying Artificial Neural Networks (ANNs) to enhance off-vertex topology recognition using data from the HADES experiment at GSI, Darmstadt. Our focus is on decays of $\Lambda$ and K$^0_{\text{S}}$ particles produced in heavy ion as well as elementary reactions. We demonstrate how ANNs can enhance the separation of weak decays from...

  6. Ms Shiqi Su (Department of Physics and Astronomy, University of Leicester)
    17/06/2025, 12:49
    Explainability & Theory
    Poster Session A

    The adoption of AI-based techniques in theoretical research is often slower than in other fields due to the perception that AI-based methods lack rigorous validation against theoretical counterparts. In this talk, we introduce COEmuNet, a surrogate model designed to emulate carbon monoxide (CO) line radiation transport in stellar atmospheres.

    COEmuNet is based on a three-dimensional...

  7. Dr Darius Jurčiukonis (Vilnius University (LT))
    18/06/2025, 16:30
    Explainability & Theory
    Poster Session B

    Machine learning techniques are used to predict theoretical constraints—such as unitarity, boundedness from below, and the potential minimum—in multi-scalar models. This approach has been demonstrated to be effective when applied to various extensions of the Standard Model that incorporate additional scalar multiplets. A high level of predictivity is achieved through appropriate neural network...

  8. Leonora Kardum
    18/06/2025, 16:33
    Explainability & Theory
    Poster Session B

    Today, many physics experiments rely on Machine Learning (ML) methods to support their data analysis pipelines. Although ML has revolutionized science, most models remain difficult to interpret, offering little insight into how they arrive at their results or how they use the information in their training data. In this work, we introduce physics-guided ML methods that keep the...

  9. Andreas Ipp (TU Wien, Austria)
    18/06/2025, 16:36
    Explainability & Theory
    Parallel talk

    Extracting continuum properties from discretized quantum field theories is significantly hindered by lattice artifacts. Fixed-point (FP) actions, defined via renormalization group transformations, offer an elegant solution by suppressing these artifacts even on coarse lattices. In this work, we employ gauge-covariant convolutional neural networks to parameterize an FP action for...

  10. Dr Donatella Genovese (Sapienza Università di Roma)
    18/06/2025, 16:56
    Explainability & Theory
    Parallel talk

    The Large Hadron Collider (LHC) at CERN generates vast amounts of data from high-energy particle collisions, requiring advanced machine learning techniques for effective analysis. While Graph Neural Networks (GNNs) have demonstrated strong predictive capabilities in high-energy physics (HEP) applications, their "black box" nature often limits interpretability. To address this challenge, we...

  11. Mariagrazia Monteleone (Politecnico di Milano)
    18/06/2025, 17:16
    Explainability & Theory
    Poster Session B

    Background: In High Energy Physics (HEP), jet tagging is a fundamental classification task that has been extensively studied using deep learning techniques. Among these, transformer networks have gained significant popularity due to their strong performance and intrinsic attention mechanisms. Furthermore, pre-trained transformer models are available for a wide range of classification...

  12. Guillermo Hijano (University of Zurich)
    18/06/2025, 17:19
    Explainability & Theory
    Poster Session B

    Experimental studies of $b$-hadron decays face significant challenges due to a wide range of backgrounds arising from the numerous possible decay channels with similar final states. For a given signal decay, identifying the most relevant background processes requires a detailed analysis of final-state particles, potential misidentifications, and kinematic overlaps...

  13. Mirko Bunse (Lamarr Institute for Machine Learning and Artificial Intelligence, Dortmund, Germany)
    19/06/2025, 16:15
    Explainability & Theory
    Parallel talk

    The resolution of any detector is finite, leading to distortions in the measured distributions. Within physics research, the indispensable correction of these distortions is known as Unfolding. Machine learning research uses a different term for this very task: Quantification Learning. For the past two decades, this difference in terminology (and some differences in notation) has prevented...

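As a minimal illustration of the shared task described above (not the unifying framework of the talk): unfolding corrects a measured histogram for a known detector response. With a toy, invertible response matrix this reduces to solving a linear system; real analyses use regularized or ML-based methods instead of direct inversion.

```python
import numpy as np

# Toy detector response: each column describes how one true bin
# smears into the measured bins (columns sum to 1).
R = np.array([[0.8, 0.1, 0.0],
              [0.2, 0.8, 0.2],
              [0.0, 0.1, 0.8]])

true_counts = np.array([100.0, 50.0, 30.0])
measured = R @ true_counts          # what the detector would record

# Unfold by solving R t = m; with noise, direct inversion amplifies
# fluctuations, which is why regularized unfolding is used in practice.
unfolded = np.linalg.solve(R, measured)
```

In quantification-learning terms, the columns of R play the role of class-conditional score distributions, and the unfolded vector is the estimated class prevalence.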
  14. Will Handley
    19/06/2025, 16:35
    Explainability & Theory
    Parallel talk

    https://arxiv.org/abs/2501.03921

    Simulation-based inference is undergoing a renaissance in statistics and machine learning. With several packages implementing the state-of-the-art in expressive AI [mackelab/sbi] [undark-lab/swyft], it is now being effectively applied to a wide range of problems in the physical sciences, biology, and beyond.

    Given the rapid pace of AI/ML, there is little...
