Description
Data from the Advanced VIRGO interferometer are processed by running search pipelines for a number of expected signal types, the most prominent being coalescing compact binaries, but also including continuous waves, burst events and detector characterisation studies. Some of this processing must be done with the lowest possible latency, so that triggers can be provided to other observatories, while deep searches run offline at external computing centres. Data must therefore also be reliably and promptly distributed from the EGO site to computing centres in Europe and the US for further analysis and archival storage.
Two defining characteristics of Advanced VIRGO computing are the heterogeneity of its activities and the need to interoperate with the LIGO observatories. A very wide array of analysis pipelines, differing in scientific target, implementation details and assumptions about the running environment, must be able to run ubiquitously and uniformly on dedicated resources and, in the longer term, on heterogeneous infrastructures.
The current status, possible strategies and the outlook are discussed.