Rényi Bounds on Information Combining

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

  • Christoph Hirche

External Research Organisations

  • University of Copenhagen

Details

Original language: English
Title of host publication: 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2297-2302
Number of pages: 6
ISBN (electronic): 9781728164328
Publication status: Published - Jun 2020
Externally published: Yes
Event: 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Los Angeles, United States
Duration: 21 Jul 2020 to 26 Jul 2020

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
Volume: 2020-June
ISSN (Print): 2157-8095

Abstract

Bounds on information combining are entropic inequalities that determine how the information, or entropy, of a set of random variables can change when they are combined in certain prescribed ways. Such bounds play an important role in information theory, particularly in coding and Shannon theory. The arguably most elementary kind of information combining is the addition of two binary random variables, i.e. a CNOT gate, and the resulting quantities are fundamental when investigating belief propagation and polar coding. In this work we will generalize the concept to Rényi entropies. We give optimal bounds on the conditional Rényi entropy after combination, based on a certain convexity or concavity property, and discuss when this property indeed holds. Since there is no generally agreed-upon definition of the conditional Rényi entropy, we consider four different versions from the literature. Finally, we discuss the application of these bounds to the polarization of Rényi entropies under polar codes.
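The classical (Shannon, α = 1) special case of the combining step described above can be illustrated numerically. The following sketch (not from the paper; the channel parameters are arbitrary examples) checks the well-known Mrs. Gerber's Lemma lower bound H(X₁⊕X₂ | Y₁Y₂) ≥ h(h⁻¹(H₁) ⋆ h⁻¹(H₂)) for two binary symmetric channels with uniform inputs, where ⋆ is binary convolution:

```python
import math

def h(p):
    """Binary entropy in bits; h(0) = h(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def h_inv(y, tol=1e-12):
    """Inverse of h restricted to [0, 1/2], via bisection."""
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if h(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def star(a, b):
    """Binary convolution a ⋆ b = a(1-b) + b(1-a)."""
    return a * (1 - b) + b * (1 - a)

# Two BSCs with crossover probabilities p1, p2; uniform inputs.
p1, p2 = 0.11, 0.2
H1, H2 = h(p1), h(p2)  # conditional entropies H(X_i | Y_i)

# The XOR of the inputs, seen through both outputs, behaves like a
# BSC with crossover probability p1 ⋆ p2.
H_combined = h(star(p1, p2))

# Mrs. Gerber's Lemma lower bound on the combined entropy.
lower = h(star(h_inv(H1), h_inv(H2)))

print(f"H1={H1:.4f}  H2={H2:.4f}  combined={H_combined:.4f}  bound={lower:.4f}")
# For BSCs the bound holds with equality, so combined matches the bound.
```

For BSC inputs the bound is tight; for general binary-input channels with the same conditional entropies it is a strict lower bound. The paper asks how such bounds extend when Shannon entropy is replaced by conditional Rényi entropies.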


Cite this

Rényi Bounds on Information Combining. / Hirche, Christoph.
2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings. Institute of Electrical and Electronics Engineers Inc., 2020. pp. 2297-2302, Article 9174256 (IEEE International Symposium on Information Theory - Proceedings; Vol. 2020-June).


Hirche, C 2020, Rényi Bounds on Information Combining. in 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings., 9174256, IEEE International Symposium on Information Theory - Proceedings, vol. 2020-June, Institute of Electrical and Electronics Engineers Inc., pp. 2297-2302, 2020 IEEE International Symposium on Information Theory, ISIT 2020, Los Angeles, United States, 21 Jul 2020. https://doi.org/10.1109/ISIT44484.2020.9174256
Hirche, C. (2020). Rényi Bounds on Information Combining. In 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings (pp. 2297-2302). Article 9174256 (IEEE International Symposium on Information Theory - Proceedings; Vol. 2020-June). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ISIT44484.2020.9174256
Hirche C. Rényi Bounds on Information Combining. In 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings. Institute of Electrical and Electronics Engineers Inc. 2020. p. 2297-2302. 9174256. (IEEE International Symposium on Information Theory - Proceedings). doi: 10.1109/ISIT44484.2020.9174256
Hirche, Christoph. / Rényi Bounds on Information Combining. 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings. Institute of Electrical and Electronics Engineers Inc., 2020. pp. 2297-2302 (IEEE International Symposium on Information Theory - Proceedings).
@inproceedings{262508d22cc645f5adff8a08f4b7fa26,
title = "R{\'e}nyi Bounds on Information Combining",
abstract = "Bounds on information combining are entropic inequalities that determine how the information, or entropy, of a set of random variables can change when they are combined in certain prescribed ways. Such bounds play an important role in information theory, particularly in coding and Shannon theory. The arguably most elementary kind of information combining is the addition of two binary random variables, i.e. a CNOT gate, and the resulting quantities are fundamental when investigating belief propagation and polar coding. In this work we will generalize the concept to R{\'e}nyi entropies. We give optimal bounds on the conditional R{\'e}nyi entropy after combination, based on a certain convexity or concavity property, and discuss when this property indeed holds. Since there is no generally agreed-upon definition of the conditional R{\'e}nyi entropy, we consider four different versions from the literature. Finally, we discuss the application of these bounds to the polarization of R{\'e}nyi entropies under polar codes.",
author = "Christoph Hirche",
note = "Funding Information: CH acknowledges financial support from the VILLUM FONDEN via the QMATH Centre of Excellence (Grant no. 10059).; 2020 IEEE International Symposium on Information Theory, ISIT 2020; Conference date: 21-07-2020 through 26-07-2020",
year = "2020",
month = jun,
doi = "10.1109/ISIT44484.2020.9174256",
language = "English",
series = "IEEE International Symposium on Information Theory - Proceedings",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
pages = "2297--2302",
booktitle = "2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings",
address = "United States",
}


TY - GEN

T1 - Rényi Bounds on Information Combining

AU - Hirche, Christoph

N1 - Funding Information: CH acknowledges financial support from the VILLUM FONDEN via the QMATH Centre of Excellence (Grant no. 10059).

PY - 2020/6

Y1 - 2020/6

N2 - Bounds on information combining are entropic inequalities that determine how the information, or entropy, of a set of random variables can change when they are combined in certain prescribed ways. Such bounds play an important role in information theory, particularly in coding and Shannon theory. The arguably most elementary kind of information combining is the addition of two binary random variables, i.e. a CNOT gate, and the resulting quantities are fundamental when investigating belief propagation and polar coding. In this work we will generalize the concept to Rényi entropies. We give optimal bounds on the conditional Rényi entropy after combination, based on a certain convexity or concavity property, and discuss when this property indeed holds. Since there is no generally agreed-upon definition of the conditional Rényi entropy, we consider four different versions from the literature. Finally, we discuss the application of these bounds to the polarization of Rényi entropies under polar codes.

AB - Bounds on information combining are entropic inequalities that determine how the information, or entropy, of a set of random variables can change when they are combined in certain prescribed ways. Such bounds play an important role in information theory, particularly in coding and Shannon theory. The arguably most elementary kind of information combining is the addition of two binary random variables, i.e. a CNOT gate, and the resulting quantities are fundamental when investigating belief propagation and polar coding. In this work we will generalize the concept to Rényi entropies. We give optimal bounds on the conditional Rényi entropy after combination, based on a certain convexity or concavity property, and discuss when this property indeed holds. Since there is no generally agreed-upon definition of the conditional Rényi entropy, we consider four different versions from the literature. Finally, we discuss the application of these bounds to the polarization of Rényi entropies under polar codes.

UR - http://www.scopus.com/inward/record.url?scp=85090404332&partnerID=8YFLogxK

U2 - 10.1109/ISIT44484.2020.9174256

DO - 10.1109/ISIT44484.2020.9174256

M3 - Conference contribution

AN - SCOPUS:85090404332

T3 - IEEE International Symposium on Information Theory - Proceedings

SP - 2297

EP - 2302

BT - 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings

PB - Institute of Electrical and Electronics Engineers Inc.

T2 - 2020 IEEE International Symposium on Information Theory, ISIT 2020

Y2 - 21 July 2020 through 26 July 2020

ER -