Description
DUNE is a future liquid-argon TPC experiment that will study neutrino oscillations and astrophysical neutrinos, taking data at a rate of 30 PB/year. Prototypes running at CERN have already taken data, and collaborators are currently analyzing 1 PB of data and 5–6 PB of simulation from the first prototype run using the resources of 48 DUNE institutions.
The DUNE computing system has evolved from the heritage of neutrino and collider experiments based at Fermilab. To achieve the increase in scale required by DUNE, it has been necessary to generalise the computing systems to make better use of resources elsewhere in the world, first at CERN and then at institutes in other countries. The integration of UK computing resources into DUNE is an informative use case for this process.
We describe how DUNE computing in the UK transitioned from ad-hoc support by a few institutes to being fully integrated into the UK infrastructure alongside the LHC experiments. This infrastructure is operated by GridPP as part of WLCG and has a mature operations culture that keeps staff at sites and experiments in regular contact. This led to increased use of WLCG-favoured tools such as GGUS tickets within DUNE as its own computing operations team grew. DUNE's expansion in the UK also coincided with the UK IRIS project and the start of its formal allocation process for resources provided to non-LHC physics and astronomy projects. This in turn contributed to a more formal description of DUNE's current requirements and its projections for the immediate future.
Experience with the constraints and features of sites in the UK has also been an input to the DUNE computing model and the Computing Conceptual Design Report. This operational experience in the UK has prompted development of new systems and features in DUNE computing, particularly in the areas of data management with RUCIO and the DUNE Workflow System by UK institutes funded by the DUNE UK Construction Project.
In-person participation: Yes