An Alphabet-Size Bound for the Information Bottleneck Function

Publication: Contribution to book/report/anthology/conference proceedings › Conference paper › Research › Peer review

Authors

  • Christoph Hirche
  • Andreas Winter

External organisations

  • Københavns Universitet
  • Universidad Autónoma de Barcelona (UAB)

Details

Original language: English
Title of host publication: 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2383-2388
Number of pages: 6
ISBN (electronic): 9781728164328
Publication status: Published - June 2020
Published externally: Yes
Event: 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Los Angeles, United States
Duration: 21 July 2020 - 26 July 2020

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
Volume: 2020-June
ISSN (print): 2157-8095

Abstract

The information bottleneck function gives a measure of the optimal preservation of correlation between some random variable X and some side information Y while compressing X into a new random variable W with bounded remaining correlation to X. As such, the information bottleneck has found many natural applications in machine learning, coding and video compression. The main task in calculating the information bottleneck is to find the optimal representation W. This could in principle be arbitrarily complicated, but fortunately it is known that the cardinality of W can be restricted to |\mathcal{W}| \leq |\mathcal{X}| + 1, which makes the calculation possible for finite |\mathcal{X}|. In many practical applications, e.g. in machine learning, X represents a potentially very large data space, while Y comes from a comparably small set of labels. This raises the question of whether the known cardinality bound can be improved in such situations. We show that the information bottleneck function can always be approximated up to an error \delta(\varepsilon, |\mathcal{Y}|) with a cardinality |\mathcal{W}| \leq f(\varepsilon, |\mathcal{Y}|), for explicitly given functions \delta and f of an approximation parameter \varepsilon > 0 and the cardinality of \mathcal{Y}. Finally, we generalize the known cardinality bounds to the case where some of the random variables represent quantum information.
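
The information bottleneck function mentioned above is commonly defined as F(t) = max I(W;Y) over channels p(w|x) satisfying I(X;W) <= t, with W - X - Y a Markov chain. The paper supplies the explicit functions \delta and f; the snippet below is only a minimal Monte Carlo sketch (not the authors' construction) illustrating the objects involved: for a small, hypothetical joint distribution p(x, y) it samples random channels p(w|x) with the classical cardinality |W| = |X| + 1 and records the pairs (I(X;W), I(W;Y)), whose upper envelope approximates the IB trade-off curve.

import numpy as np

rng = np.random.default_rng(0)

def mutual_information(p_joint):
    # I(A;B) in bits for a joint distribution given as a 2-D array.
    p_joint = p_joint / p_joint.sum()
    pa = p_joint.sum(axis=1, keepdims=True)
    pb = p_joint.sum(axis=0, keepdims=True)
    mask = p_joint > 0
    return float((p_joint[mask] * np.log2(p_joint[mask] / (pa @ pb)[mask])).sum())

# Hypothetical joint distribution p(x, y) with |X| = 4 and |Y| = 2 (illustrative only).
p_xy = np.array([[0.30, 0.05],
                 [0.05, 0.25],
                 [0.10, 0.10],
                 [0.05, 0.10]])
p_x = p_xy.sum(axis=1)
n_x, n_w = p_xy.shape[0], p_xy.shape[0] + 1   # classical bound |W| <= |X| + 1

pairs = []
for _ in range(20000):
    # Random test channel p(w|x); W - X - Y is a Markov chain by construction.
    p_w_given_x = rng.dirichlet(np.ones(n_w), size=n_x)   # shape (|X|, |W|)
    p_xw = p_x[:, None] * p_w_given_x                     # joint p(x, w)
    p_wy = p_w_given_x.T @ p_xy                           # joint p(w, y)
    pairs.append((mutual_information(p_xw), mutual_information(p_wy)))
pairs = np.array(pairs)

# Crude lower bound on the IB function at compression level t = 0.5 bits.
t = 0.5
feasible = pairs[pairs[:, 0] <= t]
if feasible.size:
    print(f"best I(W;Y) found with I(X;W) <= {t}: {feasible[:, 1].max():.3f} bits")

Scanning over t recovers an approximate trade-off curve; the point of the paper's result is that, when |Y| is small, essentially the same curve can be reproduced with a W-alphabet whose size depends only on the approximation parameter \varepsilon and |\mathcal{Y}|, not on |\mathcal{X}|.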

ASJC Scopus subject areas

Cite this

An Alphabet-Size Bound for the Information Bottleneck Function. / Hirche, Christoph; Winter, Andreas.
2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings. Institute of Electrical and Electronics Engineers Inc., 2020. pp. 2383-2388, 9174416 (IEEE International Symposium on Information Theory - Proceedings; Vol. 2020-June).


Hirche, C & Winter, A 2020, An Alphabet-Size Bound for the Information Bottleneck Function. in 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings., 9174416, IEEE International Symposium on Information Theory - Proceedings, Vol. 2020-June, Institute of Electrical and Electronics Engineers Inc., pp. 2383-2388, 2020 IEEE International Symposium on Information Theory, ISIT 2020, Los Angeles, United States, 21 July 2020. https://doi.org/10.1109/ISIT44484.2020.9174416
Hirche, C., & Winter, A. (2020). An Alphabet-Size Bound for the Information Bottleneck Function. In 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings (pp. 2383-2388). Article 9174416 (IEEE International Symposium on Information Theory - Proceedings; Vol. 2020-June). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ISIT44484.2020.9174416
Hirche C, Winter A. An Alphabet-Size Bound for the Information Bottleneck Function. In 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings. Institute of Electrical and Electronics Engineers Inc. 2020. pp. 2383-2388. 9174416. (IEEE International Symposium on Information Theory - Proceedings). doi: 10.1109/ISIT44484.2020.9174416
Hirche, Christoph ; Winter, Andreas. / An Alphabet-Size Bound for the Information Bottleneck Function. 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings. Institute of Electrical and Electronics Engineers Inc., 2020. pp. 2383-2388 (IEEE International Symposium on Information Theory - Proceedings).
BibTeX
@inproceedings{c4b1e3b793e94b7b8c28c14ec8083ac7,
title = "An Alphabet-Size Bound for the Information Bottleneck Function",
abstract = "The information bottleneck function gives a measure of the optimal preservation of correlation between some random variable X and some side information Y while compressing X into a new random variable W with bounded remaining correlation to X. As such, the information bottleneck has found many natural applications in machine learning, coding and video compression. The main task in calculating the information bottleneck is to find the optimal representation W. This could in principle be arbitrarily complicated, but fortunately it is known that the cardinality of W can be restricted to |\mathcal{W}| \leq |\mathcal{X}| + 1, which makes the calculation possible for finite |\mathcal{X}|. In many practical applications, e.g. in machine learning, X represents a potentially very large data space, while Y comes from a comparably small set of labels. This raises the question of whether the known cardinality bound can be improved in such situations. We show that the information bottleneck function can always be approximated up to an error \delta(\varepsilon, |\mathcal{Y}|) with a cardinality |\mathcal{W}| \leq f(\varepsilon, |\mathcal{Y}|), for explicitly given functions \delta and f of an approximation parameter \varepsilon > 0 and the cardinality of \mathcal{Y}. Finally, we generalize the known cardinality bounds to the case where some of the random variables represent quantum information.",
author = "Christoph Hirche and Andreas Winter",
note = "Funding Information: Acknowledgments. The authors thank Axel Foley for discussions on information recovery. CH was supported by the VILLUM FONDEN via the QMATH Centre of Excellence (Grant no. 10059). AW was supported by the Spanish MINECO, projects FIS2016-86681-P, with the support of FEDER funds; and by the Generalitat de Catalunya, CIRIT project 2014-SGR-966. ; 2020 IEEE International Symposium on Information Theory, ISIT 2020 ; Conference date: 21-07-2020 Through 26-07-2020",
year = "2020",
month = jun,
doi = "10.1109/ISIT44484.2020.9174416",
language = "English",
series = "IEEE International Symposium on Information Theory - Proceedings",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
pages = "2383--2388",
booktitle = "2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings",
address = "United States",

}

RIS

TY - GEN

T1 - An Alphabet-Size Bound for the Information Bottleneck Function

AU - Hirche, Christoph

AU - Winter, Andreas

N1 - Funding Information: Acknowledgments. The authors thank Axel Foley for discussions on information recovery. CH was supported by the VILLUM FONDEN via the QMATH Centre of Excellence (Grant no. 10059). AW was supported by the Spanish MINECO, projects FIS2016-86681-P, with the support of FEDER funds; and by the Generalitat de Catalunya, CIRIT project 2014-SGR-966.

PY - 2020/6

Y1 - 2020/6

N2 - The information bottleneck function gives a measure of the optimal preservation of correlation between some random variable X and some side information Y while compressing X into a new random variable W with bounded remaining correlation to X. As such, the information bottleneck has found many natural applications in machine learning, coding and video compression. The main task in calculating the information bottleneck is to find the optimal representation W. This could in principle be arbitrarily complicated, but fortunately it is known that the cardinality of W can be restricted to |\mathcal{W}| \leq |\mathcal{X}| + 1, which makes the calculation possible for finite |\mathcal{X}|. In many practical applications, e.g. in machine learning, X represents a potentially very large data space, while Y comes from a comparably small set of labels. This raises the question of whether the known cardinality bound can be improved in such situations. We show that the information bottleneck function can always be approximated up to an error \delta(\varepsilon, |\mathcal{Y}|) with a cardinality |\mathcal{W}| \leq f(\varepsilon, |\mathcal{Y}|), for explicitly given functions \delta and f of an approximation parameter \varepsilon > 0 and the cardinality of \mathcal{Y}. Finally, we generalize the known cardinality bounds to the case where some of the random variables represent quantum information.

AB - The information bottleneck function gives a measure of the optimal preservation of correlation between some random variable X and some side information Y while compressing X into a new random variable W with bounded remaining correlation to X. As such, the information bottleneck has found many natural applications in machine learning, coding and video compression. The main task in calculating the information bottleneck is to find the optimal representation W. This could in principle be arbitrarily complicated, but fortunately it is known that the cardinality of W can be restricted to |\mathcal{W}| \leq |\mathcal{X}| + 1, which makes the calculation possible for finite |\mathcal{X}|. In many practical applications, e.g. in machine learning, X represents a potentially very large data space, while Y comes from a comparably small set of labels. This raises the question of whether the known cardinality bound can be improved in such situations. We show that the information bottleneck function can always be approximated up to an error \delta(\varepsilon, |\mathcal{Y}|) with a cardinality |\mathcal{W}| \leq f(\varepsilon, |\mathcal{Y}|), for explicitly given functions \delta and f of an approximation parameter \varepsilon > 0 and the cardinality of \mathcal{Y}. Finally, we generalize the known cardinality bounds to the case where some of the random variables represent quantum information.

UR - http://www.scopus.com/inward/record.url?scp=85090401824&partnerID=8YFLogxK

U2 - 10.1109/ISIT44484.2020.9174416

DO - 10.1109/ISIT44484.2020.9174416

M3 - Conference contribution

AN - SCOPUS:85090401824

T3 - IEEE International Symposium on Information Theory - Proceedings

SP - 2383

EP - 2388

BT - 2020 IEEE International Symposium on Information Theory, ISIT 2020 - Proceedings

PB - Institute of Electrical and Electronics Engineers Inc.

T2 - 2020 IEEE International Symposium on Information Theory, ISIT 2020

Y2 - 21 July 2020 through 26 July 2020

ER -