List of all seminars

MINDS Seminar Series | Sanghoon Oh (NIMS) - Machine Learning Applications in Gravitational Wave Data Analysis

Date 2021-03-23 ~ 2021-03-23 Time 16:00:00 ~ 17:00:00
Speaker Sanghoon Oh Affiliation National Institute for Mathematical Sciences (NIMS)
Place Online streaming (Zoom) Streaming link ID : 688 896 1076 / PW : letmein
Topic Machine Learning Applications in Gravitational Wave Data Analysis
Contents I will briefly overview the data analysis of gravitational waves (GW) and present our recent study on a deep-learning application to gravitational waveform modelling.

The first gravitational wave, GW150914, was detected by LIGO and Virgo in 2015. The discovery came almost 100 years after the theoretical prediction of gravitational waves by Albert Einstein's general theory of relativity in 1916. The technical difficulties lie not only in detection technology but also in the challenges of data analysis. Matched filtering combined with the chi-squared statistic led to the successful detection of GW signals buried in non-stationary and non-Gaussian noise. Bayesian estimation of the physical parameters of the GW progenitor provided its location in the sky, which unveiled the link between short-duration gamma-ray bursts and the merger of neutron stars. In the first part of this talk, I will briefly overview the gravitational wave data analysis that eventually led to the successful detection of gravitational waves.

As in other fields of science, there have been many efforts to apply machine learning algorithms to gravitational wave data analysis in order to overcome issues such as computational cost, the detection of unmodelled GW signals, and the latency of parameter estimation. In the second part of this talk, I will present our recent study using a sequence-to-sequence model to generate the gravitational waveforms of merging binary black holes. Gravitational waveforms can be most precisely modelled by numerical relativity (NR). However, computing all the waveforms required in data analysis with NR is impractical because it is computationally very expensive. Our study shows that a deep learning model can potentially reduce the computational cost significantly while retaining the accuracy of the waveforms.
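The matched-filtering idea mentioned in the abstract can be illustrated with a minimal toy sketch: correlate the data stream against a known signal template and look for a peak in the normalised output. This is only a didactic example, not the LIGO/Virgo pipeline; the chirp-like template, the noise level, and the injection offset are all invented for illustration, and real pipelines whiten the data and work with frequency-domain templates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "chirp" template: a sinusoid with increasing frequency,
# tapered by a Hann window (purely illustrative, not a physical waveform).
t = np.linspace(0.0, 1.0, 1024)
template = np.sin(2.0 * np.pi * (20.0 + 30.0 * t) * t) * np.hanning(t.size)

# Bury the template in white Gaussian noise at a known offset.
data = rng.normal(0.0, 0.5, size=4096)
offset = 1500
data[offset:offset + template.size] += template

# Matched filter: slide the template along the data and correlate.
# For a known signal in white Gaussian noise, this is the optimal
# linear detection statistic.
snr = np.correlate(data, template, mode="valid")
snr /= np.sqrt(np.sum(template ** 2))  # normalise by template energy

peak = int(np.argmax(snr))
print(peak)  # the peak should land at (or very near) the injected offset
```

In real GW searches the same principle applies, but the correlation is computed in the frequency domain, weighted by the detector's noise power spectral density, over a large bank of waveform templates.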