Description
The Einstein Telescope (ET) is a proposed third-generation gravitational-wave (GW) detector with a planned sensitivity 10 times better than that of current detectors such as Advanced LIGO and Advanced Virgo.
The high rate of GW signals expected in the data will pose several data analysis challenges, such as disentangling overlapping signals and sizing the computational resources required to process all candidate events.
We explore the behaviour and performance of a data analysis pipeline designed to search for unmodelled GW signals with durations of $1$-$1000$ s, using a mock dataset consisting of $1$ month of data at ET design sensitivity, to which a realistic population of compact binary coalescence (CBC) signals is added.
Unmodelled searches are intrinsically less sensitive to CBC signals than template-based searches, but are computationally cheaper and more robust to uncertainties in the waveforms.
This search recovers $38\%$ of the total number of injected binary black hole (BBH) signals, including $89\%$ of the systems with a total mass above $100$ M$_\odot$, as well as the majority of binary neutron star (BNS) signals closer than 850 Mpc ($z=0.17$). It also estimates the chirp mass of the recovered BNS signals with an average precision of $1.3\%$.
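For reference, the chirp mass quoted here is the standard combination of the component masses $m_1$ and $m_2$ that governs the leading-order frequency evolution of the inspiral:

\begin{equation}
\mathcal{M} = \frac{(m_1 m_2)^{3/5}}{(m_1 + m_2)^{1/5}}
\end{equation}

Because the GW frequency and its time derivative during the inspiral depend on the masses chiefly through $\mathcal{M}$, it is the mass parameter best constrained by the signal's time-frequency track, which is why even an unmodelled search can recover it accurately.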
We thus show that this unmodelled search can detect a substantial fraction of CBC events at a relatively low computational cost, which makes it attractive for low-latency analyses and for independent validation of detections made by matched-filtering pipelines.
We also find that the presence of many CBC signals only marginally impacts the sensitivity of the search to other kinds of unmodelled long-duration transient signals, by $\sim 3\%$ on average.