Data mining and machine learning improve gravitational-wave detector sensitivity

Gabriele Vajente
Phys. Rev. D 105, 102005 – Published 20 May 2022

Abstract

Application of data mining and machine learning techniques can significantly improve the sensitivity of current interferometric gravitational-wave detectors. Such instruments are complex multi-input single-output systems, with close-to-linear dynamics and hundreds of active feedback control loops. We show how the application of brute-force data-mining techniques allows us to discover correlations between auxiliary monitoring channels and the main gravitational-wave output channel. We also discuss the results of applying a parametric, time-domain noise subtraction algorithm that yields a significant improvement of the detector sensitivity at frequencies below 30 Hz.
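The two-step approach described in the abstract — rank auxiliary channels by their correlation with the main output, then subtract the correlated contribution in the time domain — can be illustrated with a minimal sketch. This is not the paper's actual algorithm (which is parametric and far more sophisticated); it is a toy example on synthetic data, using a simple least-squares FIR (Wiener-like) filter for the subtraction step. All channel names and parameters here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8192  # number of samples in each synthetic channel

# Synthetic data: one auxiliary channel couples linearly into the target,
# the other is irrelevant. These stand in for detector monitoring channels.
aux = rng.standard_normal((2, n))
target = 0.5 * aux[0] + 0.1 * rng.standard_normal(n)

# Step 1: brute-force ranking of auxiliary channels by absolute
# correlation with the main output channel.
corr = [abs(np.corrcoef(a, target)[0, 1]) for a in aux]
best = int(np.argmax(corr))  # most correlated channel

# Step 2: time-domain subtraction. Fit an FIR filter (a few delayed
# copies of the chosen channel) to the target by least squares, then
# remove the predicted contribution.
taps = 8
X = np.stack([np.roll(aux[best], k) for k in range(taps)], axis=1)[taps:]
y = target[taps:]
w, *_ = np.linalg.lstsq(X, y, rcond=None)
residual = y - X @ w

print("selected channel:", best)
print("noise reduced:", np.std(residual) < 0.5 * np.std(y))
```

In a real detector analysis the ranking would be done per frequency band (e.g. via coherence spectra) and the subtraction filter would need to track slowly varying, possibly nonlinear couplings, which is what motivates the parametric approach of the paper.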

  • Received 8 March 2022
  • Accepted 10 May 2022

DOI: https://doi.org/10.1103/PhysRevD.105.102005

© 2022 American Physical Society

Physics Subject Headings (PhySH)

Gravitation, Cosmology & Astrophysics

Authors & Affiliations

Gabriele Vajente*

  • LIGO Laboratory, California Institute of Technology, Pasadena, California 91125, USA

  • *vajente@caltech.edu

Issue

Vol. 105, Iss. 10 — 15 May 2022
