Abstract:
Several theoretical waveform models have been developed over the years to capture the gravitational-wave emission from the dynamical evolution of compact binary systems of neutron stars and black holes. As ground-based detectors improve their sensitivity at low frequencies, generating these waveforms in real time becomes increasingly expensive, exacerbating the steep cost of rapidly reconstructing source parameters with Bayesian methods. This paper describes an efficient numerical algorithm for generating high-fidelity interpolated compact binary waveforms at an arbitrary point in the signal manifold by leveraging computational linear algebra techniques such as singular value decomposition and meshfree approximation. Results are presented for the time-domain \texttt{NRHybSur3dq8} inspiral-merger-ringdown (IMR) waveform model, which is fine-tuned to numerical relativity simulations and parameterized by the two component masses and two aligned spins. For demonstration, we target a specific region of the intrinsic parameter space inspired by the previously inferred parameters of the \texttt{GW200311\_115853} event -- a binary black hole system whose merger was recorded by the network of Advanced LIGO and Virgo detectors during the third observing run. We show that the meshfree interpolated waveforms can be evaluated in $\sim 2.3$~ms, about $38\times$ faster than the brute-force (frequency-domain, tapered) implementation in the \textsc{PyCBC} software package, at a median accuracy of $\sim\mathcal{O}(10^{-5})$. The algorithm is computationally efficient and scales favourably with the dimensionality of the parameter space. This technique may find use in rapid parameter estimation and source reconstruction studies.
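For orientation, the offline/online split behind the method named above -- an SVD-reduced basis built from training waveforms, whose projection coefficients are interpolated across the parameter space with a meshfree (radial basis function) scheme -- can be sketched roughly as follows. The array names, shapes, and the use of SciPy's \texttt{RBFInterpolator} are illustrative assumptions for this sketch and do not reproduce the paper's actual implementation.

\begin{verbatim}
# A minimal sketch of the SVD + meshfree (RBF) interpolation idea,
# assuming `training_params` (n_train x 4: m1, m2, chi1z, chi2z) and
# `training_waveforms` (n_train x n_samples, real-valued time-domain
# strain on a common grid) were generated offline with a model such
# as NRHybSur3dq8.  These names and shapes are illustrative only.

import numpy as np
from scipy.interpolate import RBFInterpolator

def build_surrogate(training_params, training_waveforms, n_basis=20):
    """Offline stage: SVD basis plus per-coefficient RBF interpolants."""
    # Thin SVD of the training-waveform matrix (rows = waveforms).
    U, s, Vh = np.linalg.svd(training_waveforms, full_matrices=False)
    basis = Vh[:n_basis]                   # reduced basis (n_basis x n_samples)
    coeffs = U[:, :n_basis] * s[:n_basis]  # projection coefficients (n_train x n_basis)
    # One meshfree (RBF) interpolant per basis coefficient over the 4-D space.
    interps = [RBFInterpolator(training_params, coeffs[:, k])
               for k in range(n_basis)]
    return basis, interps

def evaluate_surrogate(query_params, basis, interps):
    """Online stage: interpolate coefficients at a new point, reconstruct."""
    q = np.atleast_2d(query_params)        # shape (1, 4)
    c = np.array([f(q)[0] for f in interps])
    return c @ basis                       # waveform on the common time grid
\end{verbatim}

In this sketch the expensive steps (waveform generation and the SVD) happen once offline, so each online query reduces to a handful of low-dimensional RBF evaluations followed by a single matrix--vector product against the reduced basis.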