Description
Third-generation ground-based gravitational wave detectors such as the Einstein Telescope are expected to significantly advance our understanding of compact binary coalescences. One of the most critical data-analysis challenges for the Einstein Telescope is that of overlapping signals. With a tenfold improvement in sensitivity, the Einstein Telescope will be able to detect binary black hole and binary neutron star coalescences at expected rates of up to ~10⁵ events per year. Moreover, its extended sensitivity toward lower frequencies will keep these signals in band for much longer than in current-generation detectors. While this creates the opportunity to deepen our knowledge of these sources, detectable signals will inevitably overlap in time. This poses a severe challenge to parameter estimation pipelines and calls for a faster, unbiased parameter estimation strategy.
In this talk, we will describe a promising solution to this challenge: a deep learning approach that combines two state-of-the-art machine learning architectures, Transformers and Normalizing Flows. In particular, we present the first application of a Transformer encoder to gravitational wave data analysis. This architecture can capture complex dependencies over varying ranges, and we use it to extract the relevant information from the data. We then employ Normalizing Flows to estimate the high-dimensional posterior distributions of the overlapping signals.
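To make the pipeline concrete, the sketch below shows one way such an architecture could be wired together in PyTorch: a Transformer encoder that summarises whitened strain segments into a context vector, which then conditions a toy affine-coupling normalizing flow over the source parameters. All shapes, layer sizes, the segment-based tokenisation, and the 15-parameter dimension are illustrative assumptions, not the configuration used in the work presented in this talk.

```python
# Minimal sketch (PyTorch) of a Transformer-encoder + conditional normalizing-flow
# posterior estimator. Hypothetical sizes throughout; not the authors' setup.
import torch
import torch.nn as nn

class StrainEncoder(nn.Module):
    """Transformer encoder that summarises strain data into a context vector."""
    def __init__(self, seg_len=128, d_model=256, n_heads=8, n_layers=6):
        super().__init__()
        self.seg_len = seg_len
        self.embed = nn.Linear(seg_len, d_model)                 # one token per strain segment
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, strain):                                   # strain: (batch, n_samples)
        tokens = strain.unfold(1, self.seg_len, self.seg_len)    # (batch, n_seg, seg_len)
        h = self.encoder(self.embed(tokens))                     # (batch, n_seg, d_model)
        return h.mean(dim=1)                                     # pooled context vector

class ConditionalAffineCoupling(nn.Module):
    """One affine coupling layer conditioned on the encoder context."""
    def __init__(self, dim, context_dim, hidden=256):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half + context_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)))

    def forward(self, theta, context):                           # maps parameters -> latent
        a, b = theta[:, :self.half], theta[:, self.half:]
        scale, shift = self.net(torch.cat([a, context], dim=1)).chunk(2, dim=1)
        scale = torch.tanh(scale)                                # keep scaling well behaved
        z = torch.cat([a, b * torch.exp(scale) + shift], dim=1)
        return z, scale.sum(dim=1)                               # also return log|det Jacobian|

class PosteriorFlow(nn.Module):
    """Stack of coupling layers; trained by maximising log q(theta | strain)."""
    def __init__(self, dim=15, context_dim=256, n_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(
            [ConditionalAffineCoupling(dim, context_dim) for _ in range(n_layers)])

    def log_prob(self, theta, context):
        z, logdet = theta, 0.0
        for layer in self.layers:
            z, ld = layer(z, context)
            z = z.flip(dims=[1])                                 # permute dims between layers
            logdet = logdet + ld
        base = -0.5 * (z ** 2 + torch.log(torch.tensor(2 * torch.pi))).sum(dim=1)
        return base + logdet                                     # standard-normal base + log|det|

# Training objective: negative log-likelihood of the true (standardised) source
# parameters under the flow, conditioned on the Transformer summary of the strain.
encoder, flow = StrainEncoder(), PosteriorFlow()
strain = torch.randn(8, 4096)                                    # toy whitened strain batch
theta = torch.randn(8, 15)                                       # toy parameter batch
loss = -flow.log_prob(theta, encoder(strain)).mean()
loss.backward()
```

At inference time, samples drawn from the flow's base distribution would be pushed through the inverse couplings, conditioned on the same context vector, to obtain posterior samples for the overlapping signals.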
We will present results from training this network architecture, demonstrate its effectiveness in handling three overlapping signals simultaneously, and discuss how this deep learning method offers a promising solution to the problem, along with its potential extensions and improvements.