Genetic-algorithm-optimized neural networks for gravitational wave classification

Research output: Contribution to journal › Article › Research › peer review

Authors

  • Dwyer S. Deighan
  • Scott E. Field
  • Collin D. Capano
  • Gaurav Khanna

Research Organisations

External Research Organisations

  • University of Massachusetts Amherst
  • Max Planck Institute for Gravitational Physics (Albert Einstein Institute)
  • University of Rhode Island

Details

Original language: English
Pages (from-to): 13859-13883
Number of pages: 25
Journal: Neural Computing and Applications
Volume: 33
Issue number: 20
Early online date: 24 Apr 2021
Publication status: Published - 1 Oct 2021

Abstract

Gravitational-wave detection strategies are based on a signal analysis technique known as matched filtering. Despite the success of matched filtering, due to its computational cost, there has been recent interest in developing deep convolutional neural networks (CNNs) for signal detection. Designing these networks remains a challenge, as most procedures adopt a trial-and-error strategy to set the hyperparameter values. We propose a new method for hyperparameter optimization based on genetic algorithms (GAs). We compare six different GA variants and explore different choices for the GA-optimized fitness score. We show that the GA can discover high-quality architectures when the initial hyperparameter seed values are far from a good solution, as well as refine already good networks. For example, when starting from the architecture proposed by George and Huerta, the network optimized over the 20-dimensional hyperparameter space has 78% fewer trainable parameters while obtaining an 11% increase in accuracy for our test problem. We also show that the GA is useful when the data analysis setting (e.g., statistical properties of the noise, signal model, etc.) changes and one needs to rebuild a network. In all of our experiments, we find that the GA discovers significantly less complicated networks than the seed network, suggesting it can be used to prune wasteful network structures. While we have restricted our attention to CNN classifiers, our GA hyperparameter optimization strategy can be applied within other machine learning settings.
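As a rough illustration of the GA hyperparameter search the abstract describes, the sketch below evolves a small population of CNN hyperparameter sets (filter count, kernel size, and learning-rate exponent are illustrative choices, not the paper's 20-dimensional space) using elitism, crossover, and mutation. The `fitness` function here is a hypothetical stand-in: in the paper, fitness is computed by training a classifier and scoring its performance on gravitational-wave data.

```python
import random

# Hypothetical stand-in for "train the CNN and score it": real fitness would
# come from validation accuracy of a trained signal classifier.
def fitness(h):
    target = {"filters": 32, "kernel": 16, "lr_exp": -3}  # pretend optimum
    return -sum((h[k] - target[k]) ** 2 for k in target)

# Allowed integer range for each hyperparameter (illustrative bounds).
BOUNDS = {"filters": (8, 128), "kernel": (2, 64), "lr_exp": (-6, -1)}

def random_individual(rng):
    return {k: rng.randint(lo, hi) for k, (lo, hi) in BOUNDS.items()}

def mutate(h, rng, rate=0.3):
    # Perturb each gene with probability `rate`, clamped to its bounds.
    out = dict(h)
    for k, (lo, hi) in BOUNDS.items():
        if rng.random() < rate:
            out[k] = min(hi, max(lo, out[k] + rng.randint(-4, 4)))
    return out

def crossover(a, b, rng):
    # Uniform crossover: each gene taken from either parent at random.
    return {k: (a if rng.random() < 0.5 else b)[k] for k in a}

def ga(generations=40, pop_size=20, seed=0):
    rng = random.Random(seed)
    pop = [random_individual(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 4]  # keep the best quarter unchanged
        children = [
            mutate(crossover(rng.choice(elite), rng.choice(elite), rng), rng)
            for _ in range(pop_size - len(elite))
        ]
        pop = elite + children
    return max(pop, key=fitness)

best = ga()
```

Because the elite individuals survive unchanged, the best fitness in the population never decreases from one generation to the next; the paper's GA variants differ mainly in how selection, crossover, and mutation are performed.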

Keywords

    Evolutionary algorithms, Convolutional neural networks, Signal detection, Matched filters, Gravitational waves

Cite this

Genetic-algorithm-optimized neural networks for gravitational wave classification. / Deighan, Dwyer S.; Field, Scott E.; Capano, Collin D. et al.
In: Neural Computing and Applications, Vol. 33, No. 20, 01.10.2021, p. 13859-13883.

Deighan DS, Field SE, Capano CD, Khanna G. Genetic-algorithm-optimized neural networks for gravitational wave classification. Neural Computing and Applications. 2021 Oct 1;33(20):13859-13883. Epub 2021 Apr 24. doi: 10.48550/arXiv.2010.04340, 10.1007/s00521-021-06024-4
Deighan, Dwyer S. ; Field, Scott E. ; Capano, Collin D. et al. / Genetic-algorithm-optimized neural networks for gravitational wave classification. In: Neural Computing and Applications. 2021 ; Vol. 33, No. 20. pp. 13859-13883.
BibTeX
@article{b533e1a6b4514275ab1970a911a02cd1,
title = "Genetic-algorithm-optimized neural networks for gravitational wave classification",
abstract = "Gravitational-wave detection strategies are based on a signal analysis technique known as matched filtering. Despite the success of matched filtering, due to its computational cost, there has been recent interest in developing deep convolutional neural networks (CNNs) for signal detection. Designing these networks remains a challenge, as most procedures adopt a trial-and-error strategy to set the hyperparameter values. We propose a new method for hyperparameter optimization based on genetic algorithms (GAs). We compare six different GA variants and explore different choices for the GA-optimized fitness score. We show that the GA can discover high-quality architectures when the initial hyperparameter seed values are far from a good solution, as well as refine already good networks. For example, when starting from the architecture proposed by George and Huerta, the network optimized over the 20-dimensional hyperparameter space has 78{\%} fewer trainable parameters while obtaining an 11{\%} increase in accuracy for our test problem. We also show that the GA is useful when the data analysis setting (e.g., statistical properties of the noise, signal model, etc.) changes and one needs to rebuild a network. In all of our experiments, we find that the GA discovers significantly less complicated networks than the seed network, suggesting it can be used to prune wasteful network structures. While we have restricted our attention to CNN classifiers, our GA hyperparameter optimization strategy can be applied within other machine learning settings.",
keywords = "Evolutionary algorithms, Convolutional neural networks, Signal detection, Matched filters, Gravitational waves",
author = "Deighan, {Dwyer S.} and Field, {Scott E.} and Capano, {Collin D.} and Gaurav Khanna",
note = "Funding Information: We would like to thank Prayush Kumar, Jun Li, Caroline Mallary, Eamonn O{\textquoteright}Shea, and Matthew Wise for helpful discussions, and Vishal Tiwari for writing scripts used to compute efficiency curves. S. E. F. and D. S. D. are partially supported by NSF Grant PHY-1806665 and DMS-1912716. G.K. acknowledges research support from NSF Grants Nos. PHY-1701284, PHY-2010685 and DMS-1912716. All authors acknowledge research support from ONR/DURIP Grant No. N00014181255, which funds the computational resources used in our work. D. S. D. is partially supported by the Massachusetts Space Grant Consortium.",
year = "2021",
month = oct,
day = "1",
doi = "10.1007/s00521-021-06024-4",
language = "English",
volume = "33",
pages = "13859--13883",
journal = "Neural Computing and Applications",
issn = "0941-0643",
publisher = "Springer London",
number = "20",

}

RIS

TY - JOUR

T1 - Genetic-algorithm-optimized neural networks for gravitational wave classification

AU - Deighan, Dwyer S.

AU - Field, Scott E.

AU - Capano, Collin D.

AU - Khanna, Gaurav

N1 - Funding Information: We would like to thank Prayush Kumar, Jun Li, Caroline Mallary, Eamonn O’Shea, and Matthew Wise for helpful discussions, and Vishal Tiwari for writing scripts used to compute efficiency curves. S. E. F. and D. S. D. are partially supported by NSF Grant PHY-1806665 and DMS-1912716. G.K. acknowledges research support from NSF Grants Nos. PHY-1701284, PHY-2010685 and DMS-1912716. All authors acknowledge research support from ONR/DURIP Grant No. N00014181255, which funds the computational resources used in our work. D. S. D. is partially supported by the Massachusetts Space Grant Consortium.

PY - 2021/10/1

Y1 - 2021/10/1

N2 - Gravitational-wave detection strategies are based on a signal analysis technique known as matched filtering. Despite the success of matched filtering, due to its computational cost, there has been recent interest in developing deep convolutional neural networks (CNNs) for signal detection. Designing these networks remains a challenge, as most procedures adopt a trial-and-error strategy to set the hyperparameter values. We propose a new method for hyperparameter optimization based on genetic algorithms (GAs). We compare six different GA variants and explore different choices for the GA-optimized fitness score. We show that the GA can discover high-quality architectures when the initial hyperparameter seed values are far from a good solution, as well as refine already good networks. For example, when starting from the architecture proposed by George and Huerta, the network optimized over the 20-dimensional hyperparameter space has 78% fewer trainable parameters while obtaining an 11% increase in accuracy for our test problem. We also show that the GA is useful when the data analysis setting (e.g., statistical properties of the noise, signal model, etc.) changes and one needs to rebuild a network. In all of our experiments, we find that the GA discovers significantly less complicated networks than the seed network, suggesting it can be used to prune wasteful network structures. While we have restricted our attention to CNN classifiers, our GA hyperparameter optimization strategy can be applied within other machine learning settings.

AB - Gravitational-wave detection strategies are based on a signal analysis technique known as matched filtering. Despite the success of matched filtering, due to its computational cost, there has been recent interest in developing deep convolutional neural networks (CNNs) for signal detection. Designing these networks remains a challenge, as most procedures adopt a trial-and-error strategy to set the hyperparameter values. We propose a new method for hyperparameter optimization based on genetic algorithms (GAs). We compare six different GA variants and explore different choices for the GA-optimized fitness score. We show that the GA can discover high-quality architectures when the initial hyperparameter seed values are far from a good solution, as well as refine already good networks. For example, when starting from the architecture proposed by George and Huerta, the network optimized over the 20-dimensional hyperparameter space has 78% fewer trainable parameters while obtaining an 11% increase in accuracy for our test problem. We also show that the GA is useful when the data analysis setting (e.g., statistical properties of the noise, signal model, etc.) changes and one needs to rebuild a network. In all of our experiments, we find that the GA discovers significantly less complicated networks than the seed network, suggesting it can be used to prune wasteful network structures. While we have restricted our attention to CNN classifiers, our GA hyperparameter optimization strategy can be applied within other machine learning settings.

KW - Evolutionary algorithms

KW - Convolutional neural networks

KW - Signal detection

KW - Matched filters

KW - Gravitational waves

UR - http://www.scopus.com/inward/record.url?scp=85105342933&partnerID=8YFLogxK

U2 - 10.1007/s00521-021-06024-4

DO - 10.1007/s00521-021-06024-4

M3 - Article

VL - 33

SP - 13859

EP - 13883

JO - Neural Computing and Applications

JF - Neural Computing and Applications

SN - 0941-0643

IS - 20

ER -