Details
Original language | English |
---|---|
Title of host publication | 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 2297-2302 |
Number of pages | 6 |
ISBN (electronic) | 9781728164328 |
Publication status | Published - June 2020 |
Published externally | Yes |
Event | 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Los Angeles, United States. Duration: 21 July 2020 → 26 July 2020 |
Publication series
Name | IEEE International Symposium on Information Theory - Proceedings |
---|---|
Volume | 2020-June |
ISSN (print) | 2157-8095 |
Abstract
Bounds on information combining are entropic inequalities that determine how the information, or entropy, of a set of random variables can change when they are combined in certain prescribed ways. Such bounds play an important role in information theory, particularly in coding and Shannon theory. The arguably most elementary kind of information combining is the addition of two binary random variables, i.e. a CNOT gate, and the resulting quantities are fundamental when investigating belief propagation and polar coding. In this work we generalize the concept to Rényi entropies. We give optimal bounds on the conditional Rényi entropy after combination, based on a certain convexity or concavity property, and discuss when this property indeed holds. Since there is no generally agreed upon definition of the conditional Rényi entropy, we consider four different versions from the literature. Finally, we discuss the application of these bounds to the polarization of Rényi entropies under polar codes.
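For the classical (Shannon, α = 1) case mentioned in the abstract, the elementary combining step can be illustrated numerically: when two uniform bits are observed through binary symmetric channels, the conditional entropy of their XOR is the binary entropy of the convolved crossover probability, and it can only increase relative to either input. A minimal Python sketch under that BSC assumption (the function names `h2` and `conv` are illustrative, not taken from the paper):

```python
import math

def h2(p):
    """Binary entropy h(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def conv(p, q):
    """Binary convolution p * (1-q) + q * (1-p): the crossover probability
    of the XOR of two independent BSC errors with parameters p and q."""
    return p * (1 - q) + q * (1 - p)

# Two uniform bits X1, X2, each observed through a BSC (illustrative values).
p, q = 0.1, 0.2
H1, H2 = h2(p), h2(q)        # H(X1|Y1) and H(X2|Y2)
H_combined = h2(conv(p, q))  # H(X1 XOR X2 | Y1, Y2) for BSC observations

# Combining by XOR can only increase the conditional entropy.
assert H_combined >= max(H1, H2)
```

For general (non-BSC) observations, the classical bound on information combining (Mrs. Gerber's Lemma) states that `h2(conv(p, q))` with `p, q` chosen to match the input entropies is the extremal value; the BSC case above attains it with equality.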
ASJC Scopus subject areas
- Mathematics (all)
- Theoretical Computer Science
- Computer Science (all)
- Information Systems
- Mathematics (all)
- Modeling and Simulation
- Mathematics (all)
- Applied Mathematics
Cite this
- Standard
- Harvard
- APA
- Vancouver
- BibTeX
- RIS
2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings. Institute of Electrical and Electronics Engineers Inc., 2020. pp. 2297-2302, 9174256 (IEEE International Symposium on Information Theory - Proceedings; Vol. 2020-June).
Publication: Chapter in book/report/conference proceedings › Conference contribution › Research › Peer-reviewed
TY - GEN
T1 - Rényi Bounds on Information Combining
AU - Hirche, Christoph
N1 - Funding Information: CH acknowledges financial support from the VILLUM FONDEN via the QMATH Centre of Excellence (Grant no. 10059).
PY - 2020/6
Y1 - 2020/6
N2 - Bounds on information combining are entropic inequalities that determine how the information, or entropy, of a set of random variables can change when they are combined in certain prescribed ways. Such bounds play an important role in information theory, particularly in coding and Shannon theory. The arguably most elementary kind of information combining is the addition of two binary random variables, i.e. a CNOT gate, and the resulting quantities are fundamental when investigating belief propagation and polar coding. In this work we generalize the concept to Rényi entropies. We give optimal bounds on the conditional Rényi entropy after combination, based on a certain convexity or concavity property, and discuss when this property indeed holds. Since there is no generally agreed upon definition of the conditional Rényi entropy, we consider four different versions from the literature. Finally, we discuss the application of these bounds to the polarization of Rényi entropies under polar codes.
AB - Bounds on information combining are entropic inequalities that determine how the information, or entropy, of a set of random variables can change when they are combined in certain prescribed ways. Such bounds play an important role in information theory, particularly in coding and Shannon theory. The arguably most elementary kind of information combining is the addition of two binary random variables, i.e. a CNOT gate, and the resulting quantities are fundamental when investigating belief propagation and polar coding. In this work we generalize the concept to Rényi entropies. We give optimal bounds on the conditional Rényi entropy after combination, based on a certain convexity or concavity property, and discuss when this property indeed holds. Since there is no generally agreed upon definition of the conditional Rényi entropy, we consider four different versions from the literature. Finally, we discuss the application of these bounds to the polarization of Rényi entropies under polar codes.
UR - http://www.scopus.com/inward/record.url?scp=85090404332&partnerID=8YFLogxK
U2 - 10.1109/ISIT44484.2020.9174256
DO - 10.1109/ISIT44484.2020.9174256
M3 - Conference contribution
AN - SCOPUS:85090404332
T3 - IEEE International Symposium on Information Theory - Proceedings
SP - 2297
EP - 2302
BT - 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 IEEE International Symposium on Information Theory, ISIT 2020
Y2 - 21 July 2020 through 26 July 2020
ER -