Uncertainty: Ideas Behind Neural Networks Lead Us Beyond KL-Decomposition and Interval Fields.

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

Michael Beer, Olga Kosheleva, Vladik Kreinovich

Research Organisations

External Research Organisations

  • University of Texas at El Paso

Details

Original language: English
Title of host publication: 2021 IEEE Symposium Series on Computational Intelligence (SSCI)
Number of pages: 7
ISBN (electronic): 978-1-7281-9048-8
Publication status: Published - 2021

Abstract

In many practical situations, we know that there is a functional dependence between a quantity $q$ and quantities $a_1, \ldots, a_n$, but the exact form of this dependence is known only with uncertainty. In some cases, we know only the class of possible functions describing this dependence. In other cases, we also know the probabilities of different functions from this class - i.e., we know the corresponding random field or random process. To solve problems related to such a dependence, it is desirable to be able to simulate the corresponding functions, i.e., to have algorithms that transform simple intervals or simple random variables into functions from the desired class. Many real-life dependencies are very complex, requiring a large amount of computation time even if we ignore the uncertainty. So, to make the simulation of uncertainty practically feasible, we need to make sure that the corresponding simulation algorithm is as fast as possible. In this paper, we show that for this objective, the ideas behind neural networks lead to the known Karhunen-Loeve decomposition and interval field techniques - and also that these ideas help us go, when necessary, beyond these techniques.
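
To make the two techniques named in the abstract concrete, here is a minimal sketch, not taken from the paper, of how a truncated Karhunen-Loeve expansion turns a few independent standard-normal variables into sample paths of a random field, and how replacing those random coefficients with interval coefficients yields an interval field. The squared-exponential covariance, the grid, and the truncation level are illustrative assumptions.

import numpy as np

# Discretize the index set [0, 1] on a uniform grid.
n_points = 200
t = np.linspace(0.0, 1.0, n_points)

# Assumed covariance model: squared-exponential, correlation length 0.2.
ell = 0.2
cov = np.exp(-((t[:, None] - t[None, :]) ** 2) / (2.0 * ell ** 2))

# KL decomposition = eigendecomposition of the covariance matrix.
# eigh returns eigenvalues in ascending order, so reverse them.
eigvals, eigvecs = np.linalg.eigh(cov)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]

# Truncate to the m dominant modes; this is what makes simulation fast,
# since one sample path then needs only m random numbers.
m = 10
lam = np.maximum(eigvals[:m], 0.0)  # guard against tiny negative values
phi = eigvecs[:, :m]

def kl_sample(rng):
    """One realization f(t) = sum_k sqrt(lam_k) * xi_k * phi_k(t),
    with xi_k independent standard normal."""
    xi = rng.standard_normal(m)
    return phi @ (np.sqrt(lam) * xi)

rng = np.random.default_rng(0)
samples = np.array([kl_sample(rng) for _ in range(1000)])
# Sanity check: the empirical variance should approximate diag(cov) = 1.
print("mean sample variance:", samples.var(axis=0).mean())

# Interval-field analogue: replace the random coefficients xi_k by interval
# coefficients xi_k in [-1, 1]. Since each basis function enters linearly,
# the pointwise bounds are +/- sum_k sqrt(lam_k) * |phi_k(t)|.
half_width = np.abs(phi) @ np.sqrt(lam)
print("max half-width of the interval field:", half_width.max())

The speed argument of the abstract is visible in the truncation step: instead of simulating all 200 grid values directly, each realization is built from only 10 coefficients, and the same small basis serves both the probabilistic and the interval description.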

Keywords

    Interval fields, Karhunen-Loeve decomposition, Neural networks

Cite this

Uncertainty: Ideas Behind Neural Networks Lead Us Beyond KL-Decomposition and Interval Fields. / Beer, Michael; Kosheleva, Olga; Kreinovich, Vladik.
2021 IEEE Symposium Series on Computational Intelligence (SSCI). 2021.


Beer, M, Kosheleva, O & Kreinovich, V 2021, Uncertainty: Ideas Behind Neural Networks Lead Us Beyond KL-Decomposition and Interval Fields. in 2021 IEEE Symposium Series on Computational Intelligence (SSCI). https://doi.org/10.1109/SSCI50451.2021.9660145
Beer, M., Kosheleva, O., & Kreinovich, V. (2021). Uncertainty: Ideas Behind Neural Networks Lead Us Beyond KL-Decomposition and Interval Fields. In 2021 IEEE Symposium Series on Computational Intelligence (SSCI). https://doi.org/10.1109/SSCI50451.2021.9660145
Beer M, Kosheleva O, Kreinovich V. Uncertainty: Ideas Behind Neural Networks Lead Us Beyond KL-Decomposition and Interval Fields. In 2021 IEEE Symposium Series on Computational Intelligence (SSCI). 2021. doi: 10.1109/SSCI50451.2021.9660145
Beer, Michael ; Kosheleva, Olga ; Kreinovich, Vladik. / Uncertainty: Ideas Behind Neural Networks Lead Us Beyond KL-Decomposition and Interval Fields. 2021 IEEE Symposium Series on Computational Intelligence (SSCI). 2021.
BibTeX
@inproceedings{7f7d18aed5714911a0974f70d367bc1a,
title = "Uncertainty: Ideas Behind Neural Networks Lead Us Beyond KL-Decomposition and Interval Fields.",
abstract = "In many practical situations, we know that there is a functional dependence between a quantity $q$ and quantities $a_1, \ldots, a_n$, but the exact form of this dependence is known only with uncertainty. In some cases, we know only the class of possible functions describing this dependence. In other cases, we also know the probabilities of different functions from this class - i.e., we know the corresponding random field or random process. To solve problems related to such a dependence, it is desirable to be able to simulate the corresponding functions, i.e., to have algorithms that transform simple intervals or simple random variables into functions from the desired class. Many real-life dependencies are very complex, requiring a large amount of computation time even if we ignore the uncertainty. So, to make the simulation of uncertainty practically feasible, we need to make sure that the corresponding simulation algorithm is as fast as possible. In this paper, we show that for this objective, the ideas behind neural networks lead to the known Karhunen-Loeve decomposition and interval field techniques - and also that these ideas help us go, when necessary, beyond these techniques.",
keywords = "Interval fields, Karhunen-Loeve decomposition, Neural networks",
author = "Michael Beer and Olga Kosheleva and Vladik Kreinovich",
note = "Funding Information: This work was supported in part by the National Science Foundation grants 1623190 (A Model of Change for Preparing a New Generation for Professional Practice in Computer Science), and HRD-1834620 and HRD-2034030 (CAHSI Includes), and by the AT&T Fellowship in Information Technology. It was also supported by the program of the development of the Scientific-Educational Mathematical Center of Volga Federal District No. 075-02-2020-1478, and by a grant from the Hungarian National Research, Development and Innovation Office (NRDI).",
year = "2021",
doi = "10.1109/SSCI50451.2021.9660145",
language = "English",
isbn = "978-1-7281-9049-5",
booktitle = "2021 IEEE Symposium Series on Computational Intelligence (SSCI)",

}

RIS

TY - GEN

T1 - Uncertainty: Ideas Behind Neural Networks Lead Us Beyond KL-Decomposition and Interval Fields.

AU - Beer, Michael

AU - Kosheleva, Olga

AU - Kreinovich, Vladik

N1 - Funding Information: This work was supported in part by the National Science Foundation grants 1623190 (A Model of Change for Preparing a New Generation for Professional Practice in Computer Science), and HRD-1834620 and HRD-2034030 (CAHSI Includes), and by the AT&T Fellowship in Information Technology. It was also supported by the program of the development of the Scientific-Educational Mathematical Center of Volga Federal District No. 075-02-2020-1478, and by a grant from the Hungarian National Research, Development and Innovation Office (NRDI).

PY - 2021

Y1 - 2021

N2 - In many practical situations, we know that there is a functional dependence between a quantity $q$ and quantities $a_1, \ldots, a_n$, but the exact form of this dependence is known only with uncertainty. In some cases, we know only the class of possible functions describing this dependence. In other cases, we also know the probabilities of different functions from this class - i.e., we know the corresponding random field or random process. To solve problems related to such a dependence, it is desirable to be able to simulate the corresponding functions, i.e., to have algorithms that transform simple intervals or simple random variables into functions from the desired class. Many real-life dependencies are very complex, requiring a large amount of computation time even if we ignore the uncertainty. So, to make the simulation of uncertainty practically feasible, we need to make sure that the corresponding simulation algorithm is as fast as possible. In this paper, we show that for this objective, the ideas behind neural networks lead to the known Karhunen-Loeve decomposition and interval field techniques - and also that these ideas help us go, when necessary, beyond these techniques.

AB - In many practical situations, we know that there is a functional dependence between a quantity $q$ and quantities $a_1, \ldots, a_n$, but the exact form of this dependence is known only with uncertainty. In some cases, we know only the class of possible functions describing this dependence. In other cases, we also know the probabilities of different functions from this class - i.e., we know the corresponding random field or random process. To solve problems related to such a dependence, it is desirable to be able to simulate the corresponding functions, i.e., to have algorithms that transform simple intervals or simple random variables into functions from the desired class. Many real-life dependencies are very complex, requiring a large amount of computation time even if we ignore the uncertainty. So, to make the simulation of uncertainty practically feasible, we need to make sure that the corresponding simulation algorithm is as fast as possible. In this paper, we show that for this objective, the ideas behind neural networks lead to the known Karhunen-Loeve decomposition and interval field techniques - and also that these ideas help us go, when necessary, beyond these techniques.

KW - Interval fields

KW - Karhunen-Loeve decomposition

KW - Neural networks

UR - http://www.scopus.com/inward/record.url?scp=85125805193&partnerID=8YFLogxK

U2 - 10.1109/SSCI50451.2021.9660145

DO - 10.1109/SSCI50451.2021.9660145

M3 - Conference contribution

SN - 978-1-7281-9049-5

BT - 2021 IEEE Symposium Series on Computational Intelligence (SSCI)

ER -
