Details
Original language | English |
---|---|
Title of host publication | Information Processing and Management of Uncertainty in Knowledge-Based Systems |
Subtitle | 18th International Conference, IPMU 2020, Proceedings |
Editors | Marie-Jeanne Lesot, Susana Vieira, Marek Z. Reformat, João Paulo Carvalho, Anna Wilbik, Bernadette Bouchon-Meunier, Ronald R. Yager |
Place of publication | Cham |
Pages | 70-79 |
Number of pages | 10 |
Volume | 1 |
ISBN (electronic) | 9783030501464 |
Publication status | Published - 2020 |
Event | 18th International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems - Lisbon, Portugal Duration: 15 June 2020 → 19 June 2020 Conference number: 18 https://ipmu2020.inesc-id.pt/ |
Publication series
Name | Communications in Computer and Information Science |
---|---|
Volume | 1237 |
ISSN (Print) | 1865-0929 |
ISSN (electronic) | 1865-0937 |
Abstract
In many practical situations, we only know the interval containing the quantity of interest; we have no information about the probabilities of different values within this interval. In contrast to the cases when we know the distributions and can thus use Monte-Carlo simulations, processing such interval uncertainty is difficult – crudely speaking, because we would need to try all possible distributions on this interval. Sometimes, the problem can be simplified: namely, for estimating the range of values of some characteristic of the distribution, it is possible to select a single distribution (or a small family of distributions) whose analysis provides a good understanding of the situation. The best-known case is estimating the largest possible value of Shannon’s entropy: in this case, it is sufficient to consider the uniform distribution on the interval. Interestingly, estimating other characteristics leads to the selection of the same uniform distribution: e.g., estimating the largest possible values of generalized entropy or of some sensitivity-related characteristics. In this paper, we provide a general explanation of why the uniform distribution appears in these different situations – namely, it appears every time we have a permutation-invariant optimization problem with a unique optimum. We also discuss what happens if we have an optimization problem that attains its optimum at several different distributions – this happens, e.g., when we are estimating the smallest possible value of Shannon’s entropy (or of its generalizations).
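The abstract's central claim – that the uniform distribution maximizes Shannon's entropy over all distributions on an interval – can be checked numerically on a discretized version of the problem. The sketch below is not from the paper; it is a minimal illustration that assumes the interval is split into `n` equally sized cells, so a distribution becomes a probability vector of length `n`, and entropy is computed with the natural logarithm.

```python
import math
import random

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * ln(p_i), with 0*ln(0) = 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

n = 10  # number of cells the interval is discretized into
uniform = [1.0 / n] * n
h_uniform = shannon_entropy(uniform)  # equals ln(n), the maximum entropy

# Sample random distributions on the same n cells; none should exceed ln(n).
random.seed(0)
for _ in range(1000):
    weights = [random.random() for _ in range(n)]
    total = sum(weights)
    p = [w / total for w in weights]
    assert shannon_entropy(p) <= h_uniform + 1e-12

print(h_uniform, math.log(n))
```

Note that the *minimum* of entropy, discussed at the end of the abstract, is attained at every degenerate distribution (all mass in one cell, entropy 0), which is why no single distribution represents that case.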
ASJC Scopus subject areas
- Computer Science (all)
- General Computer Science
- Mathematics (all)
- General Mathematics
Cite
Information Processing and Management of Uncertainty in Knowledge-Based Systems: 18th International Conference, IPMU 2020, Proceedings. Eds. Marie-Jeanne Lesot; Susana Vieira; Marek Z. Reformat; João Paulo Carvalho; Anna Wilbik; Bernadette Bouchon-Meunier; Ronald R. Yager. Vol. 1 Cham, 2020. pp. 70-79 (Communications in Computer and Information Science; Vol. 1237).
Publication: Contribution to book/report/anthology/conference proceedings › Conference paper › Research › Peer review
TY - GEN
T1 - Which Distributions (or Families of Distributions) Best Represent Interval Uncertainty
T2 - 18th International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems
AU - Beer, Michael
AU - Urenda, Julio
AU - Kosheleva, Olga
AU - Kreinovich, Vladik
N1 - Conference code: 18
PY - 2020
Y1 - 2020
N2 - In many practical situations, we only know the interval containing the quantity of interest; we have no information about the probabilities of different values within this interval. In contrast to the cases when we know the distributions and can thus use Monte-Carlo simulations, processing such interval uncertainty is difficult – crudely speaking, because we would need to try all possible distributions on this interval. Sometimes, the problem can be simplified: namely, for estimating the range of values of some characteristic of the distribution, it is possible to select a single distribution (or a small family of distributions) whose analysis provides a good understanding of the situation. The best-known case is estimating the largest possible value of Shannon’s entropy: in this case, it is sufficient to consider the uniform distribution on the interval. Interestingly, estimating other characteristics leads to the selection of the same uniform distribution: e.g., estimating the largest possible values of generalized entropy or of some sensitivity-related characteristics. In this paper, we provide a general explanation of why the uniform distribution appears in these different situations – namely, it appears every time we have a permutation-invariant optimization problem with a unique optimum. We also discuss what happens if we have an optimization problem that attains its optimum at several different distributions – this happens, e.g., when we are estimating the smallest possible value of Shannon’s entropy (or of its generalizations).
AB - In many practical situations, we only know the interval containing the quantity of interest; we have no information about the probabilities of different values within this interval. In contrast to the cases when we know the distributions and can thus use Monte-Carlo simulations, processing such interval uncertainty is difficult – crudely speaking, because we would need to try all possible distributions on this interval. Sometimes, the problem can be simplified: namely, for estimating the range of values of some characteristic of the distribution, it is possible to select a single distribution (or a small family of distributions) whose analysis provides a good understanding of the situation. The best-known case is estimating the largest possible value of Shannon’s entropy: in this case, it is sufficient to consider the uniform distribution on the interval. Interestingly, estimating other characteristics leads to the selection of the same uniform distribution: e.g., estimating the largest possible values of generalized entropy or of some sensitivity-related characteristics. In this paper, we provide a general explanation of why the uniform distribution appears in these different situations – namely, it appears every time we have a permutation-invariant optimization problem with a unique optimum. We also discuss what happens if we have an optimization problem that attains its optimum at several different distributions – this happens, e.g., when we are estimating the smallest possible value of Shannon’s entropy (or of its generalizations).
KW - Interval uncertainty
KW - Maximum Entropy approach
KW - Sensitivity analysis
KW - Uniform distribution
UR - http://www.scopus.com/inward/record.url?scp=85086249018&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-50146-4_6
DO - 10.1007/978-3-030-50146-4_6
M3 - Conference contribution
AN - SCOPUS:85086249018
SN - 9783030501457
VL - 1
T3 - Communications in Computer and Information Science
SP - 70
EP - 79
BT - Information Processing and Management of Uncertainty in Knowledge-Based Systems
A2 - Lesot, Marie-Jeanne
A2 - Vieira, Susana
A2 - Reformat, Marek Z.
A2 - Carvalho, João Paulo
A2 - Wilbik, Anna
A2 - Bouchon-Meunier, Bernadette
A2 - Yager, Ronald R.
CY - Cham
Y2 - 15 June 2020 through 19 June 2020
ER -