Rényi Bounds on Information Combining

Publication: Contribution to book/report/anthology/conference proceedings › Conference paper › Research › Peer-reviewed

Authors

  • Christoph Hirche

External organisations

  • Københavns Universitet

Details

Original language: English
Title of host publication: 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2297-2302
Number of pages: 6
ISBN (electronic): 9781728164328
Publication status: Published - June 2020
Published externally: Yes
Event: 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Los Angeles, United States
Duration: 21 July 2020 - 26 July 2020

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
Volume: 2020-June
ISSN (print): 2157-8095

Abstract

Bounds on information combining are entropic inequalities that determine how the information, or entropy, of a set of random variables can change when they are combined in certain prescribed ways. Such bounds play an important role in information theory, particularly in coding and Shannon theory. The arguably most elementary kind of information combining is the addition of two binary random variables, i.e. a CNOT gate, and the resulting quantities are fundamental when investigating belief propagation and polar coding. In this work we will generalize the concept to Rényi entropies. We give optimal bounds on the conditional Rényi entropy after combination, based on a certain convexity or concavity property, and discuss when this property indeed holds. Since there is no generally agreed upon definition of the conditional Rényi entropy, we consider four different versions from the literature. Finally, we discuss the application of these bounds to the polarization of Rényi entropies under polar codes.
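The following minimal Python sketch is not part of the paper; it only illustrates the elementary combining step mentioned in the abstract: forming the XOR of two independent binary random variables (the classical action of a CNOT gate on the target bit) and evaluating a Rényi entropy of the result. The order alpha and the biases p and q are arbitrary example values, and the sketch uses unconditional entropies, whereas the paper's bounds concern conditional Rényi entropies.

import numpy as np

def renyi_entropy(probs, alpha):
    """Rényi entropy of a discrete distribution, in bits (Shannon entropy as alpha -> 1)."""
    probs = np.asarray(probs, dtype=float)
    probs = probs[probs > 0]
    if np.isclose(alpha, 1.0):
        return float(-np.sum(probs * np.log2(probs)))
    return float(np.log2(np.sum(probs ** alpha)) / (1.0 - alpha))

def xor_combine(p, q):
    """Distribution of X1 XOR X2 for independent Bernoulli(p) and Bernoulli(q)."""
    r = p * (1 - q) + q * (1 - p)   # probability that the XOR equals 1
    return [1 - r, r]

# Arbitrary example parameters (not taken from the paper).
alpha = 2.0
p, q = 0.11, 0.2
h1 = renyi_entropy([1 - p, p], alpha)
h2 = renyi_entropy([1 - q, q], alpha)
h12 = renyi_entropy(xor_combine(p, q), alpha)
print(f"H_{alpha}(X1)        = {h1:.4f} bits")
print(f"H_{alpha}(X2)        = {h2:.4f} bits")
print(f"H_{alpha}(X1 XOR X2) = {h12:.4f} bits")

In this unconditional binary example the XOR output is always at least as close to uniform as either input, so its Rényi entropy is never smaller than the individual entropies; the paper's contribution concerns the much less obvious conditional analogue of such statements.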

ASJC Scopus subject areas

Cite

Rényi Bounds on Information Combining. / Hirche, Christoph.
2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings. Institute of Electrical and Electronics Engineers Inc., 2020. pp. 2297-2302, Article 9174256 (IEEE International Symposium on Information Theory - Proceedings; Vol. 2020-June).


Hirche, C 2020, Rényi Bounds on Information Combining. in 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings., 9174256, IEEE International Symposium on Information Theory - Proceedings, vol. 2020-June, Institute of Electrical and Electronics Engineers Inc., pp. 2297-2302, 2020 IEEE International Symposium on Information Theory, ISIT 2020, Los Angeles, United States, 21 July 2020. https://doi.org/10.1109/ISIT44484.2020.9174256
Hirche, C. (2020). Rényi Bounds on Information Combining. In 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings (pp. 2297-2302). Article 9174256 (IEEE International Symposium on Information Theory - Proceedings; Vol. 2020-June). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ISIT44484.2020.9174256
Hirche C. Rényi Bounds on Information Combining. In: 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings. Institute of Electrical and Electronics Engineers Inc.; 2020. pp. 2297-2302. 9174256. (IEEE International Symposium on Information Theory - Proceedings). doi: 10.1109/ISIT44484.2020.9174256
Hirche, Christoph. / Rényi Bounds on Information Combining. 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings. Institute of Electrical and Electronics Engineers Inc., 2020. pp. 2297-2302 (IEEE International Symposium on Information Theory - Proceedings).
@inproceedings{262508d22cc645f5adff8a08f4b7fa26,
title = "R{\'e}nyi Bounds on Information Combining",
abstract = "Bounds on information combining are entropic inequalities that determine how the information, or entropy, of a set of random variables can change when they are combined in certain prescribed ways. Such bounds play an important role in information theory, particularly in coding and Shannon theory. The arguably most elementary kind of information combining is the addition of two binary random variables, i.e. a CNOT gate, and the resulting quantities are fundamental when investigating belief propagation and polar coding. In this work we will generalize the concept to R{\'e}nyi entropies. We give optimal bounds on the conditional R{\'e}nyi entropy after combination, based on a certain convexity or concavity property and discuss when this property indeed holds. Since there is no generally agreed upon definition of the conditional R{\'e}nyi entropy, we consider four different versions from the literature. Finally, we discuss the application of these bounds to the polarization of R{\'e}nyi entropies under polar codes.",
author = "Christoph Hirche",
note = "Funding Information: CH acknowledges financial support from the VILLUM FONDEN via the QMATH Centre of Excellence (Grant no. 10059). ; 2020 IEEE International Symposium on Information Theory, ISIT 2020 ; Conference date: 21-07-2020 Through 26-07-2020",
year = "2020",
month = jun,
doi = "10.1109/ISIT44484.2020.9174256",
language = "English",
series = "IEEE International Symposium on Information Theory - Proceedings",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
pages = "2297--2302",
booktitle = "2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings",
address = "United States",

}


TY - GEN

T1 - Rényi Bounds on Information Combining

AU - Hirche, Christoph

N1 - Funding Information: CH acknowledges financial support from the VILLUM FONDEN via the QMATH Centre of Excellence (Grant no. 10059).

PY - 2020/6

Y1 - 2020/6

N2 - Bounds on information combining are entropic inequalities that determine how the information, or entropy, of a set of random variables can change when they are combined in certain prescribed ways. Such bounds play an important role in information theory, particularly in coding and Shannon theory. The arguably most elementary kind of information combining is the addition of two binary random variables, i.e. a CNOT gate, and the resulting quantities are fundamental when investigating belief propagation and polar coding. In this work we will generalize the concept to Rényi entropies. We give optimal bounds on the conditional Rényi entropy after combination, based on a certain convexity or concavity property and discuss when this property indeed holds. Since there is no generally agreed upon definition of the conditional Rényi entropy, we consider four different versions from the literature. Finally, we discuss the application of these bounds to the polarization of Rényi entropies under polar codes.

AB - Bounds on information combining are entropic inequalities that determine how the information, or entropy, of a set of random variables can change when they are combined in certain prescribed ways. Such bounds play an important role in information theory, particularly in coding and Shannon theory. The arguably most elementary kind of information combining is the addition of two binary random variables, i.e. a CNOT gate, and the resulting quantities are fundamental when investigating belief propagation and polar coding. In this work we will generalize the concept to Rényi entropies. We give optimal bounds on the conditional Rényi entropy after combination, based on a certain convexity or concavity property and discuss when this property indeed holds. Since there is no generally agreed upon definition of the conditional Rényi entropy, we consider four different versions from the literature. Finally, we discuss the application of these bounds to the polarization of Rényi entropies under polar codes.

UR - http://www.scopus.com/inward/record.url?scp=85090404332&partnerID=8YFLogxK

U2 - 10.1109/ISIT44484.2020.9174256

DO - 10.1109/ISIT44484.2020.9174256

M3 - Conference contribution

AN - SCOPUS:85090404332

T3 - IEEE International Symposium on Information Theory - Proceedings

SP - 2297

EP - 2302

BT - 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings

PB - Institute of Electrical and Electronics Engineers Inc.

T2 - 2020 IEEE International Symposium on Information Theory, ISIT 2020

Y2 - 21 July 2020 through 26 July 2020

ER -