High energy physics research has always leveraged bleeding-edge computing solutions. Now, as experiments turn their focus to high-precision measurements, data need to be understood and analyzed better than ever; this requires, among other things, new coding paradigms that enforce reproducibility and improve usability.
In this talk I will show how it is possible to enable continuous integration on CMS datasets by using the GitLab CI and harnessing the computing resources made available by the INFN CMS Analysis Facility. In particular, I show that it is possible to integrate the submission of jobs to HTCondor into the GitLab CI, thus facilitating the handling of big datasets.
In this way, analysts will be able to quickly run different tests on their data, perform different analyses and, at the same time, keep track of all the changes made.
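As a rough illustration of the idea (a minimal sketch, not the actual pipeline presented in the talk), a GitLab CI job that hands work off to HTCondor could look like the following; the runner tag, submit file, log path, and output directory are all assumptions:

```yaml
# .gitlab-ci.yml -- hypothetical sketch; tags, file names, and paths are assumptions
stages:
  - analysis

run_analysis:
  stage: analysis
  tags:
    - htcondor                       # assumed runner with access to an HTCondor schedd
  script:
    - condor_submit analysis.sub     # submit the analysis jobs to the batch system
    - condor_wait logs/analysis.log  # block until the submitted jobs complete
  artifacts:
    paths:
      - output/                      # keep results tied to this commit for reproducibility
```

Because every pipeline run is bound to a specific commit, the batch results stored as artifacts can always be traced back to the exact code version that produced them.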