Neural network guided adjoint computations in dual weighted residual error estimation

Research output: Contribution to journal › Article › Research › peer review


Details

Original language: English
Article number: 62
Journal: SN Applied Sciences
Volume: 4
Issue number: 2
Early online date: 31 Jan 2022
Publication status: Published - Feb 2022

Abstract

In this work, we are concerned with neural network guided goal-oriented a posteriori error estimation and adaptivity using the dual weighted residual method. The primal problem is solved with classical Galerkin finite elements. The adjoint problem is solved in strong form with a feedforward neural network using two or three hidden layers. The main objective of our approach is to explore alternatives for solving the adjoint problem with greater potential for numerical cost reduction. The proposed algorithm is based on the general goal-oriented error estimation theorem and covers both linear and nonlinear stationary partial differential equations and goal functionals. Our developments are substantiated with numerical experiments that include comparisons of neural network computed adjoints and classical finite element solutions of the adjoints. In the software implementation, the open-source finite element library deal.II is coupled with LibTorch, the PyTorch C++ application programming interface.

Article Highlights:

    Adjoint approximation with a feedforward neural network in dual weighted residual error estimation.
    Side-by-side comparisons of accuracy and computational cost with classical finite element computations.
    Numerical experiments for linear and nonlinear problems yielding excellent effectivity indices.
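
For orientation, the dual weighted residual method estimates the error in a goal functional J by weighting the primal residual with the adjoint solution z. The following LaTeX sketch shows the generic error representation and the effectivity index referred to in the highlights; the notation (semilinear form A(u)(·), right-hand side F(·), discrete adjoint z_h) is standard DWR notation rather than a quotation of the paper, whose estimator for nonlinear problems may include further remainder terms.

% Generic DWR error representation (sketch, standard notation):
% u    exact primal solution,   u_h  Galerkin finite element solution,
% z    adjoint solution (here approximated by a neural network),
% z_h  a discrete or interpolated adjoint used inside the estimator.
\[
  J(u) - J(u_h) \approx \eta := F(z - z_h) - A(u_h)(z - z_h),
  \qquad
  I_{\mathrm{eff}} := \frac{\eta}{J(u) - J(u_h)},
\]
% An effectivity index close to one indicates an accurate estimator.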

Keywords

    A posteriori error estimation, Adjoint, Deal.II, Dual weighted residuals, LibTorch, Neural network
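
Since the keywords and the abstract indicate that deal.II is coupled with LibTorch and that the adjoint is approximated by a feedforward network with two or three hidden layers, the following self-contained C++ sketch illustrates how such a network could be set up and trained with LibTorch. The layer widths, the tanh activation, the random collocation points, and the mean-squared placeholder loss are illustrative assumptions only and do not reproduce the authors' implementation.

// Minimal LibTorch sketch (not the authors' code): a feedforward network
// with two hidden layers that maps a 2D coordinate to a scalar adjoint
// value, plus a generic training loop.
#include <torch/torch.h>
#include <iostream>

int main() {
  // Two hidden layers of width 32 with tanh activations (assumed sizes).
  torch::nn::Sequential adjoint_net(
      torch::nn::Linear(2, 32), torch::nn::Tanh(),
      torch::nn::Linear(32, 32), torch::nn::Tanh(),
      torch::nn::Linear(32, 1));

  torch::optim::Adam optimizer(adjoint_net->parameters(),
                               torch::optim::AdamOptions(1e-3));

  // Hypothetical collocation points in the unit square and dummy targets;
  // a strong-form approach would instead penalize the adjoint PDE residual
  // evaluated at these points.
  auto x = torch::rand({1024, 2});
  auto target = torch::zeros({1024, 1});

  for (int epoch = 0; epoch < 1000; ++epoch) {
    optimizer.zero_grad();
    auto prediction = adjoint_net->forward(x);
    auto loss = torch::mse_loss(prediction, target);  // placeholder loss
    loss.backward();
    optimizer.step();
    if (epoch % 100 == 0)
      std::cout << "epoch " << epoch << " loss " << loss.item<double>() << "\n";
  }
  return 0;
}

After training, such a network would be evaluated at quadrature points to provide the adjoint weights in the error representation sketched above; the loop here only illustrates the LibTorch mechanics of the coupling.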


Cite this

Neural network guided adjoint computations in dual weighted residual error estimation. / Roth, Julian; Schröder, Max; Wick, Thomas.
In: SN Applied Sciences, Vol. 4, No. 2, 62, 02.2022.


Roth J, Schröder M, Wick T. Neural network guided adjoint computations in dual weighted residual error estimation. SN Applied Sciences. 2022 Feb;4(2):62. Epub 2022 Jan 31. doi: 10.1007/s42452-022-04938-9
Roth, Julian; Schröder, Max; Wick, Thomas. / Neural network guided adjoint computations in dual weighted residual error estimation. In: SN Applied Sciences. 2022; Vol. 4, No. 2.
BibTeX
@article{46ada52da9dd46a9a8b7d16975710a4f,
title = "Neural network guided adjoint computations in dual weighted residual error estimation",
abstract = "Abstract: In this work, we are concerned with neural network guided goal-oriented a posteriori error estimation and adaptivity using the dual weighted residual method. The primal problem is solved using classical Galerkin finite elements. The adjoint problem is solved in strong form with a feedforward neural network using two or three hidden layers. The main objective of our approach is to explore alternatives for solving the adjoint problem with greater potential of a numerical cost reduction. The proposed algorithm is based on the general goal-oriented error estimation theorem including both linear and nonlinear stationary partial differential equations and goal functionals. Our developments are substantiated with some numerical experiments that include comparisons of neural network computed adjoints and classical finite element solutions of the adjoints. In the programming software, the open-source library deal.II is successfully coupled with LibTorch, the PyTorch C++ application programming interface. Article Highlights: Adjoint approximation with feedforward neural network in dual-weighted residual error estimation.Side-by-side comparisons for accuracy and computational cost with classical finite element computations.Numerical experiments for linear and nonlinear problems yielding excellent effectivity indices.",
keywords = "A posteriori error estimation, Adjoint, Deal.II, Dual weighted residuals, LibTorch, Neural network",
author = "Julian Roth and Max Schr{\"o}der and Thomas Wick",
note = "Funding Information: This work is supported by the Deutsche Forschungsgemeinschaft (DFG) under Germany{\textquoteright}s Excellence Strategy within the cluster of Excellence PhoenixD (EXC 2122, Project ID 390833453). Moreover, we thank the anonymous reviewers for several suggestions that helped to improve the paper. ",
year = "2022",
month = feb,
doi = "10.1007/s42452-022-04938-9",
language = "English",
volume = "4",
number = "2",

}

RIS

TY - JOUR

T1 - Neural network guided adjoint computations in dual weighted residual error estimation

AU - Roth, Julian

AU - Schröder, Max

AU - Wick, Thomas

N1 - Funding Information: This work is supported by the Deutsche Forschungsgemeinschaft (DFG) under Germany’s Excellence Strategy within the cluster of Excellence PhoenixD (EXC 2122, Project ID 390833453). Moreover, we thank the anonymous reviewers for several suggestions that helped to improve the paper.

PY - 2022/2

Y1 - 2022/2

N2 - Abstract: In this work, we are concerned with neural network guided goal-oriented a posteriori error estimation and adaptivity using the dual weighted residual method. The primal problem is solved using classical Galerkin finite elements. The adjoint problem is solved in strong form with a feedforward neural network using two or three hidden layers. The main objective of our approach is to explore alternatives for solving the adjoint problem with greater potential of a numerical cost reduction. The proposed algorithm is based on the general goal-oriented error estimation theorem including both linear and nonlinear stationary partial differential equations and goal functionals. Our developments are substantiated with some numerical experiments that include comparisons of neural network computed adjoints and classical finite element solutions of the adjoints. In the programming software, the open-source library deal.II is successfully coupled with LibTorch, the PyTorch C++ application programming interface. Article Highlights: Adjoint approximation with feedforward neural network in dual-weighted residual error estimation. Side-by-side comparisons for accuracy and computational cost with classical finite element computations. Numerical experiments for linear and nonlinear problems yielding excellent effectivity indices.

AB - Abstract: In this work, we are concerned with neural network guided goal-oriented a posteriori error estimation and adaptivity using the dual weighted residual method. The primal problem is solved using classical Galerkin finite elements. The adjoint problem is solved in strong form with a feedforward neural network using two or three hidden layers. The main objective of our approach is to explore alternatives for solving the adjoint problem with greater potential of a numerical cost reduction. The proposed algorithm is based on the general goal-oriented error estimation theorem including both linear and nonlinear stationary partial differential equations and goal functionals. Our developments are substantiated with some numerical experiments that include comparisons of neural network computed adjoints and classical finite element solutions of the adjoints. In the programming software, the open-source library deal.II is successfully coupled with LibTorch, the PyTorch C++ application programming interface. Article Highlights: Adjoint approximation with feedforward neural network in dual-weighted residual error estimation. Side-by-side comparisons for accuracy and computational cost with classical finite element computations. Numerical experiments for linear and nonlinear problems yielding excellent effectivity indices.

KW - A posteriori error estimation

KW - Adjoint

KW - Deal.II

KW - Dual weighted residuals

KW - LibTorch

KW - Neural network

UR - http://www.scopus.com/inward/record.url?scp=85123986247&partnerID=8YFLogxK

U2 - 10.1007/s42452-022-04938-9

DO - 10.1007/s42452-022-04938-9

M3 - Article

AN - SCOPUS:85123986247

VL - 4

JO - SN Applied Sciences

JF - SN Applied Sciences

IS - 2

M1 - 62

ER -
