
FranSys—A Fast Non-Autoregressive Recurrent Neural Network for Multi-Step Ahead Prediction

Publication: Journal article › Research › Peer-reviewed

Details

Original language: English
Pages (from - to): 145130 - 145147
Number of pages: 18
Journal: IEEE ACCESS
Volume: 12
Early online date: 3 Oct 2024
Publication status: Published - 2024

Abstract

Neural network-based nonlinear system identification is crucial for various multi-step ahead prediction tasks, including model predictive control and digital twins. These applications demand models that are not only accurate but also efficient in training and deployment. While current state-of-the-art neural network-based methods can identify accurate models, they often become prohibitively slow when scaled to achieve high accuracy, limiting their use in resource-constrained or time-critical applications. We propose FranSys, a Fast recurrent neural network-based method for multi-step ahead prediction in non-autoregressive System Identification. FranSys comprises three key innovations: 1) the first non-autoregressive RNN model structure for multi-step ahead prediction that enables much faster training and inference compared to autoregressive RNNs by separating state estimation and prediction into two specialized sub-models, 2) a state distribution alignment training technique that enhances generalizability and 3) a prediction horizon scheduling method that accelerates training by progressively increasing the prediction horizon. We evaluate FranSys on three publicly available benchmark datasets representing diverse systems, comparing its speed and accuracy against state-of-the-art RNN-based multi-step ahead prediction methods. The evaluation includes various prediction horizons, model sizes, and hyperparameter optimization settings, using both our own implementations and those from related work. Results demonstrate that FranSys is 10 to 100 times faster in training and inference with the same and often higher accuracy on test data than state-of-the-art RNN-based multi-step ahead prediction methods, particularly with long prediction horizons. This substantial speed improvement enables the application of larger neural network-based models with longer prediction horizons on resource-constrained systems in time-critical tasks, such as model predictive control and online learning of digital twins. The code of FranSys is publicly available.
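The abstract's central architectural idea is splitting the model into a state-estimation sub-model (which consumes past inputs and outputs) and a prediction sub-model (which rolls out over the known future inputs only), so no predicted output has to be fed back step by step. The following is a minimal, hypothetical PyTorch sketch of that two-sub-model, non-autoregressive pattern; it is not the authors' published implementation (their code is available separately), and all class and parameter names here are illustrative assumptions.

```python
# Hypothetical sketch of the non-autoregressive two-sub-model idea from the
# abstract: an estimator RNN infers an initial hidden state from past (u, y),
# and a predictor RNN processes the future inputs in one batched pass,
# avoiding the step-by-step feedback loop of autoregressive rollouts.
import torch
import torch.nn as nn


class TwoStageRNN(nn.Module):
    def __init__(self, n_inputs, n_outputs, hidden_size=64):
        super().__init__()
        # State estimator: reads a window of past input/output pairs.
        self.estimator = nn.GRU(n_inputs + n_outputs, hidden_size, batch_first=True)
        # Predictor: reads future inputs only; predictions are never fed back.
        self.predictor = nn.GRU(n_inputs, hidden_size, batch_first=True)
        self.readout = nn.Linear(hidden_size, n_outputs)

    def forward(self, past_u, past_y, future_u):
        # past_u: (B, T_past, n_inputs), past_y: (B, T_past, n_outputs)
        # future_u: (B, T_horizon, n_inputs)
        _, h0 = self.estimator(torch.cat([past_u, past_y], dim=-1))
        # Single parallel pass over the whole prediction horizon.
        pred_states, _ = self.predictor(future_u, h0)
        return self.readout(pred_states)  # (B, T_horizon, n_outputs)


# Shape-only usage example with random data (purely illustrative).
model = TwoStageRNN(n_inputs=2, n_outputs=1)
y_hat = model(torch.randn(8, 50, 2), torch.randn(8, 50, 1), torch.randn(8, 100, 2))
print(y_hat.shape)  # torch.Size([8, 100, 1])
```

Because the predictor never waits for its own previous output, the whole horizon can be computed in one RNN pass per batch, which is the mechanism behind the training and inference speed-ups claimed in the abstract; the paper's further contributions (state distribution alignment and prediction horizon scheduling) are training techniques not shown in this sketch.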

ASJC Scopus subject areas

Cite this

FranSys—A Fast Non-Autoregressive Recurrent Neural Network for Multi-Step Ahead Prediction. / Weber, Daniel O. M.; Gühmann, Clemens; Seel, Thomas.
In: IEEE ACCESS, Vol. 12, 2024, pp. 145130 - 145147.


Weber DOM, Gühmann C, Seel T. FranSys—A Fast Non-Autoregressive Recurrent Neural Network for Multi-Step Ahead Prediction. IEEE ACCESS. 2024;12:145130 - 145147. Epub 2024 Oct 3. doi: 10.1109/ACCESS.2024.3473014
@article{f7fffff93310428197f36cba4d12e1cf,
title = "FranSys—A Fast Non-Autoregressive Recurrent Neural Network for Multi-Step Ahead Prediction",
abstract = "Neural network-based nonlinear system identification is crucial for various multi-step ahead prediction tasks, including model predictive control and digital twins. These applications demand models that are not only accurate but also efficient in training and deployment. While current state-of-the-art neural network-based methods can identify accurate models, they often become prohibitively slow when scaled to achieve high accuracy, limiting their use in resource-constrained or time-critical applications. We propose FranSys, a Fast recurrent neural network-based method for multi-step ahead prediction in non-autoregressive System Identification. FranSys comprises three key innovations: 1) the first non-autoregressive RNN model structure for multi-step ahead prediction that enables much faster training and inference compared to autoregressive RNNs by separating state estimation and prediction into two specialized sub-models, 2) a state distribution alignment training technique that enhances generalizability and 3) a prediction horizon scheduling method that accelerates training by progressively increasing the prediction horizon. We evaluate FranSys on three publicly available benchmark datasets representing diverse systems, comparing its speed and accuracy against state-of-the-art RNN-based multi-step ahead prediction methods. The evaluation includes various prediction horizons, model sizes, and hyperparameter optimization settings, using both our own implementations and those from related work. Results demonstrate that FranSys is 10 to 100 times faster in training and inference with the same and often higher accuracy on test data than state-of-the-art RNN-based multi-step ahead prediction methods, particularly with long prediction horizons. This substantial speed improvement enables the application of larger neural network-based models with longer prediction horizons on resource-constrained systems in time-critical tasks, such as model predictive control and online learning of digital twins. The code of FranSys is publicly available.",
keywords = "Benchmarking, digital twins, machine learning, model predictive control, neural networks, performance evaluation, prediction methods, recurrent neural networks, supervised learning, system identification",
author = "Weber, {Daniel O. M.} and Clemens G{\"u}hmann and Thomas Seel",
note = "Publisher Copyright: {\textcopyright} 2024 The Authors.",
year = "2024",
doi = "10.1109/ACCESS.2024.3473014",
language = "English",
volume = "12",
pages = "145130 -- 145147",
journal = "IEEE ACCESS",
issn = "2169-3536",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}


TY - JOUR

T1 - FranSys—A Fast Non-Autoregressive Recurrent Neural Network for Multi-Step Ahead Prediction

AU - Weber, Daniel O. M.

AU - Gühmann, Clemens

AU - Seel, Thomas

N1 - Publisher Copyright: © 2024 The Authors.

PY - 2024

Y1 - 2024

N2 - Neural network-based nonlinear system identification is crucial for various multi-step ahead prediction tasks, including model predictive control and digital twins. These applications demand models that are not only accurate but also efficient in training and deployment. While current state-of-the-art neural network-based methods can identify accurate models, they often become prohibitively slow when scaled to achieve high accuracy, limiting their use in resource-constrained or time-critical applications. We propose FranSys, a Fast recurrent neural network-based method for multi-step ahead prediction in non-autoregressive System Identification. FranSys comprises three key innovations: 1) the first non-autoregressive RNN model structure for multi-step ahead prediction that enables much faster training and inference compared to autoregressive RNNs by separating state estimation and prediction into two specialized sub-models, 2) a state distribution alignment training technique that enhances generalizability and 3) a prediction horizon scheduling method that accelerates training by progressively increasing the prediction horizon. We evaluate FranSys on three publicly available benchmark datasets representing diverse systems, comparing its speed and accuracy against state-of-the-art RNN-based multi-step ahead prediction methods. The evaluation includes various prediction horizons, model sizes, and hyperparameter optimization settings, using both our own implementations and those from related work. Results demonstrate that FranSys is 10 to 100 times faster in training and inference with the same and often higher accuracy on test data than state-of-the-art RNN-based multi-step ahead prediction methods, particularly with long prediction horizons. This substantial speed improvement enables the application of larger neural network-based models with longer prediction horizons on resource-constrained systems in time-critical tasks, such as model predictive control and online learning of digital twins. The code of FranSys is publicly available.

AB - Neural network-based nonlinear system identification is crucial for various multi-step ahead prediction tasks, including model predictive control and digital twins. These applications demand models that are not only accurate but also efficient in training and deployment. While current state-of-the-art neural network-based methods can identify accurate models, they often become prohibitively slow when scaled to achieve high accuracy, limiting their use in resource-constrained or time-critical applications. We propose FranSys, a Fast recurrent neural network-based method for multi-step ahead prediction in non-autoregressive System Identification. FranSys comprises three key innovations: 1) the first non-autoregressive RNN model structure for multi-step ahead prediction that enables much faster training and inference compared to autoregressive RNNs by separating state estimation and prediction into two specialized sub-models, 2) a state distribution alignment training technique that enhances generalizability and 3) a prediction horizon scheduling method that accelerates training by progressively increasing the prediction horizon. We evaluate FranSys on three publicly available benchmark datasets representing diverse systems, comparing its speed and accuracy against state-of-the-art RNN-based multi-step ahead prediction methods. The evaluation includes various prediction horizons, model sizes, and hyperparameter optimization settings, using both our own implementations and those from related work. Results demonstrate that FranSys is 10 to 100 times faster in training and inference with the same and often higher accuracy on test data than state-of-the-art RNN-based multi-step ahead prediction methods, particularly with long prediction horizons. This substantial speed improvement enables the application of larger neural network-based models with longer prediction horizons on resource-constrained systems in time-critical tasks, such as model predictive control and online learning of digital twins. The code of FranSys is publicly available.

KW - Benchmarking

KW - digital twins

KW - machine learning

KW - model predictive control

KW - neural networks

KW - performance evaluation

KW - prediction methods

KW - recurrent neural networks

KW - supervised learning

KW - system identification

UR - http://www.scopus.com/inward/record.url?scp=85206837727&partnerID=8YFLogxK

U2 - 10.1109/ACCESS.2024.3473014

DO - 10.1109/ACCESS.2024.3473014

M3 - Article

VL - 12

SP - 145130

EP - 145147

JO - IEEE ACCESS

JF - IEEE ACCESS

SN - 2169-3536

ER -
