Description
The full optimization of the design and operation of instruments whose functioning relies on the interaction of radiation with matter is a super-human task, given the large dimensionality of the space of possible choices for geometry, detection technology, materials, and data-acquisition and information-extraction techniques, and the interdependence of the related parameters. On the other hand, enormous potential gains in performance over standard, "experience-driven" layouts are in principle within reach if an objective function fully aligned with the final goals of the instrument is maximized through a systematic search of the configuration space.
The stochastic nature of the quantum processes involved makes the modeling of these systems an intractable problem from a classical statistics point of view, yet the construction of a fully differentiable pipeline and the use of deep learning techniques may allow the simultaneous optimization of all design parameters.
In this presentation I will lay out plans for a modular and versatile modeling tool for the end-to-end optimization of complex instruments, targeting particle physics experiments as well as industrial and medical applications that share the detection of radiation as their basic ingredient. I will also show results from a muon tomography use case that highlight the potential of this approach.
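To make the differentiable-pipeline idea mentioned above concrete, the following is a minimal illustrative sketch in JAX, not the MODE/TomOpt implementation: a toy, hand-written surrogate stands in for the stochastic physics simulation, an objective function combines instrument performance with a cost term, and all design parameters are updated simultaneously by gradient descent. All names, the surrogate formula, and the parameter choices are hypothetical.

```python
# Hypothetical sketch of end-to-end differentiable detector optimization (not MODE/TomOpt code).
import jax
import jax.numpy as jnp

def surrogate_resolution(layer_positions, material_budget):
    """Toy differentiable surrogate for tracking resolution (illustrative only).
    Resolution improves with the lever arm spanned by the layers but degrades
    with multiple scattering, which grows with the material budget."""
    lever_arm = layer_positions[-1] - layer_positions[0]
    scattering = material_budget * jnp.sum(1.0 / (1.0 + layer_positions))
    return 1.0 / (lever_arm + 1e-3) + 0.1 * scattering

def objective(design):
    """Objective aligned with the instrument's goal: resolution plus a
    penalty on the total material (standing in for cost constraints)."""
    positions, budget = design["positions"], design["budget"]
    cost = 0.05 * budget
    return surrogate_resolution(positions, budget) + cost

# Gradient-based update of all design parameters at once.
design = {"positions": jnp.array([0.1, 0.5, 1.0]), "budget": jnp.array(1.0)}
grad_fn = jax.grad(objective)
learning_rate = 0.1
for step in range(200):
    grads = grad_fn(design)
    design = jax.tree_util.tree_map(lambda p, g: p - learning_rate * g, design, grads)

print("optimized design:", design)
```

In a realistic setting the hand-written surrogate would be replaced by a differentiable (e.g. neural-network) model of the full simulation and reconstruction chain, and physical constraints on the parameters would be enforced, but the optimization loop keeps the same structure.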
Collaboration: MODE (see https://mode-collaboration.github.io)