29 May 2023 to 2 June 2023
Pollica Physics Center
Europe/Rome timezone

Contribution List

  1. Prof. Roberto Bondesan (Imperial College London)

    Quantum error correction is a critical component for scaling up quantum computing. Given a quantum code, an optimal decoder maps the measured code violations to the most likely error that occurred, but its cost scales exponentially with the system size. Neural network decoders are an appealing solution since they can learn from data an efficient approximation to such a mapping and can...

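    As a minimal sketch of the decoding task described above (not the codes or decoders considered in the talk), the toy below uses a 3-qubit repetition code with an assumed independent bit-flip noise model: the optimal decoder enumerates every possible error to map each syndrome to its most likely error, which is exactly the exponentially expensive lookup that a neural network decoder would learn to approximate from data.

        import itertools
        import numpy as np

        # Parity-check matrix of a 3-qubit repetition code (illustrative toy example).
        H = np.array([[1, 1, 0],
                      [0, 1, 1]])
        p = 0.1  # assumed independent bit-flip probability per qubit

        # Optimal decoder: enumerate all 2^n errors (exponential cost) and keep,
        # for every syndrome, the error with the highest prior probability.
        best = {}
        for bits in itertools.product([0, 1], repeat=H.shape[1]):
            e = np.array(bits)
            syndrome = tuple(int(v) for v in H @ e % 2)  # measured code violations
            prob = np.prod(np.where(e == 1, p, 1 - p))   # likelihood of this error
            if syndrome not in best or prob > best[syndrome][1]:
                best[syndrome] = (e, prob)

        # A neural decoder would be trained on (syndrome, error) pairs to
        # approximate this mapping at polynomial cost.
        for s, (e, _) in sorted(best.items()):
            print(f"syndrome {s} -> most likely error {e}")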
  2. Kim Nicoli (Bonn U.)

    The task of learning non-trivial probability densities is a central problem in machine learning, with countless applications in, among other fields, computer vision, sound synthesis, text generation, and the natural sciences. The subfield of machine learning that leverages deep learning to learn complicated probability distributions and to sample from them is known as Generative AI....

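    As a minimal, non-deep-learning illustration of the task described above (learning a probability density from data and sampling from it), the toy below fits a two-component Gaussian mixture by maximum likelihood on synthetic data; the generative models discussed in the talk replace this simple parametric family with deep networks.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)

        # Synthetic 1D data from a bimodal distribution (purely illustrative).
        data = np.concatenate([rng.normal(-2.0, 0.5, 500),
                               rng.normal(+2.0, 0.8, 500)]).reshape(-1, 1)

        # Fit an explicit density model by maximum likelihood (EM under the hood).
        model = GaussianMixture(n_components=2, random_state=0).fit(data)

        # The learned model provides both a density estimate ...
        print("average log-likelihood:", model.score(data))
        # ... and a generator of new samples that resemble the data.
        new_samples, _ = model.sample(5)
        print("synthetic samples:", new_samples.ravel())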
  3. Gérard Ben Arous (New York U.)

    I will survey recent progress on the behavior of SGD in high-dimensional settings.
    (Based on joint work with Aukosh Jagannath and Reza Gheissari, and upcoming work with them and Jiaoyang Huang.)

  4. Sébastien Racanière (DeepMind)

    There have recently been impressive advances in generative models for sound, text, and images. In this talk, I will look into applications of generative models to Lattice QCD. The models I will consider are flows, which are families of diffeomorphisms transforming simple base distributions into complicated target distributions. I will explain why we believe that flows are suitable...

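    As a minimal sketch of the change-of-variables mechanism behind flows (a single affine map in one dimension with made-up parameters; flows for Lattice QCD compose many learned invertible layers), the snippet below pushes a simple base distribution through a diffeomorphism and evaluates the exact log-probability of the result via the Jacobian.

        import numpy as np

        # Simple base distribution: standard normal.
        def log_prob_base(z):
            return -0.5 * z**2 - 0.5 * np.log(2 * np.pi)

        # One affine diffeomorphism x = a * z + b (illustrative parameters;
        # trained flows stack many such invertible, learnable layers).
        a, b = 2.0, 1.0
        forward = lambda z: a * z + b
        inverse = lambda x: (x - b) / a

        # Change of variables: log p_x(x) = log p_z(f^{-1}(x)) - log |df/dz|.
        def log_prob_flow(x):
            return log_prob_base(inverse(x)) - np.log(abs(a))

        # Sampling is just pushing base samples through the map.
        rng = np.random.default_rng(0)
        samples = forward(rng.standard_normal(5))
        print("samples:", samples)
        print("log-probabilities:", log_prob_flow(samples))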
  5. Vasily Sazonov (CEA-LIST)

    First, I will describe a quantum error mitigation scheme designed for parametric circuits that are accessible to classical computation in some range of their parameters. I will demonstrate the scheme on the example of the 4-spin antiferromagnetic Ising model in a transverse field, and discuss possible applications to the sign problem in Monte Carlo simulations.
    Then, I will...

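    For scale, the classical reference computation behind the benchmark named above is tiny: the 4-spin antiferromagnetic Ising chain in a transverse field lives in a 2^4 = 16 dimensional Hilbert space, so its exact spectrum is directly accessible. The couplings, sign conventions, and periodic boundary conditions below are assumptions made for illustration, not necessarily the talk's exact setup.

        import numpy as np

        # Pauli matrices and the single-spin identity.
        sx = np.array([[0, 1], [1, 0]], dtype=float)
        sz = np.array([[1, 0], [0, -1]], dtype=float)
        I2 = np.eye(2)

        def on_site(op, site, n):
            """Embed a single-site operator at `site` in an n-spin Hilbert space."""
            mats = [I2] * n
            mats[site] = op
            out = mats[0]
            for m in mats[1:]:
                out = np.kron(out, m)
            return out

        # H = J * sum_i sz_i sz_{i+1} - h * sum_i sx_i with J > 0 (antiferromagnetic),
        # periodic boundary conditions; J, h chosen arbitrarily for illustration.
        n, J, h = 4, 1.0, 0.5
        H = sum(J * on_site(sz, i, n) @ on_site(sz, (i + 1) % n, n) for i in range(n))
        H -= sum(h * on_site(sx, i, n) for i in range(n))

        # Exact diagonalization of the 16 x 16 Hamiltonian.
        print("ground-state energy:", np.linalg.eigvalsh(H).min())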
  6. Atakan Hilmi Fırat (MIT; IAIFI)

    String vertices are the geometric ingredient underlying string field theory. In this talk, I will outline the bootstrap formalism for constructing hyperbolic string vertices. The emphasis will be on how machine learning may provide a natural numerical framework to explicitly realize this construction. Based on arXiv:2211.09129 and arXiv:2302.12843.

  7. Prof. Beatriz Seoane (Paris-Saclay University)

    Energy-based models (EBMs) are powerful generative machine learning models that encode the complex distribution of a dataset in the Gibbs-Boltzmann distribution of a model energy function. This means that, if properly trained, they can be used to synthesize new samples that resemble those of the dataset as closely as possible, but also that this energy function can be used to...

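    Written out, the Gibbs-Boltzmann form referred to above is the standard one below (generic notation, not specific to the talk); the maximum-likelihood gradient makes explicit why one trained energy function can both generate and score configurations, since training lowers the energy on data points and raises it on the model's own samples.

        p_\theta(x) = \frac{e^{-E_\theta(x)}}{Z_\theta},
        \qquad
        Z_\theta = \int e^{-E_\theta(x)}\,\mathrm{d}x,
        \qquad
        \nabla_\theta \log p_\theta(x)
          = -\nabla_\theta E_\theta(x)
            + \mathbb{E}_{x' \sim p_\theta}\!\big[\nabla_\theta E_\theta(x')\big].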
  8. Greg Yang (Microsoft Research)

    Recently, the theory of infinite-width neural networks led to muTransfer, the first technology for tuning enormous neural networks that are too expensive to train more than once. For example, it allowed us to tune the 6.7-billion-parameter version of GPT-3 using only 7% of its pretraining compute budget and, with some asterisks, obtain performance comparable to the original GPT-3 model...

  9. Sven Krippendorf (LMU)

    A theory of neural networks (NNs) built upon collective variables would provide scientists with the tools to better understand the learning process at every stage. I argue that a fruitful path towards understanding non-linear neural network dynamics is to exploit the analogy with physical systems. As an example, I demonstrate that the dynamics of neural networks trained with...

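    As a minimal illustration of the idea of collective variables for learning dynamics (the toy model and the variables below, a linear model trained by gradient descent with its loss and squared weight norm tracked, are generic examples rather than the quantities proposed in the talk), a handful of low-dimensional summaries can already make a high-dimensional training trajectory interpretable.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy supervised problem: linear regression trained by full-batch gradient descent.
        X = rng.standard_normal((200, 50))
        w_true = rng.standard_normal(50)
        y = X @ w_true + 0.1 * rng.standard_normal(200)

        w = np.zeros(50)          # 50 "microscopic" parameters
        lr, steps = 0.05, 200

        for t in range(steps + 1):
            residual = X @ w - y
            loss = 0.5 * np.mean(residual**2)
            if t % 50 == 0:
                # Collective variables: low-dimensional summaries of the weight dynamics.
                print(f"step {t:3d}   loss {loss:.4f}   |w|^2 {np.dot(w, w):.3f}")
            w -= lr * (X.T @ residual) / len(y)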