The Einstein Telescope (ET) will be able to observe a sky volume one thousand times larger than that of the second-generation observatories, and this will be reflected in a higher detection rate. The physics information contained in the strain time series will increase, while on the machine side the size of the raw data from the interferometers will scale with the number and complexity of the detectors. Meeting ET's specific computing needs therefore requires an adequate choice of the technologies, tools, and frameworks used to handle the collected data, share them among interested users, and enable their offline analysis.
The solution currently under test for data management and distribution is based on Rucio and on the concept of a Data Lake. Rucio is a tool originally developed in the high-energy physics domain and now used by many HEP and non-HEP experiments. This talk will report on the status of the test setup in place for ET and on the development of multi-Research-Infrastructure access to the data, in view of cooperation with Cosmic Explorer. On the computing side, since ET is not expected to begin data taking for at least ten years, it is crucial to keep pace with continually improving technology and to test new architectures as they become available. A computing cluster dedicated to Technology Tracking is under construction within the scope of the ETIC project; this talk will also give an update on the status of its deployment.