Operationally meaningful representations of physical systems in neural networks

Publication: Contribution to journal › Article › Research › Peer-reviewed

Authors

  • Hendrik Poulsen Nautrup
  • Tony Metger
  • Raban Iten
  • Sofiene Jerbi
  • Lea M. Trenkwalder
  • Henrik Wilming
  • Hans J. Briegel
  • Renato Renner

External organisations

  • Universität Innsbruck
  • ETH Zürich
  • Universität Konstanz

Details

Original language: English
Article number: 045025
Journal: Machine Learning: Science and Technology
Volume: 3
Issue number: 4
Publication status: Published - 16 Dec 2022
Published externally: Yes

Abstract

To make progress in science, we often build abstract representations of physical systems that meaningfully encode information about the systems. Such representations ignore redundant features and treat parameters such as velocity and position separately because they can be useful for making statements about different experimental settings. Here, we capture this notion by formally defining the concept of operationally meaningful representations. We present an autoencoder architecture with attention mechanism that can generate such representations and demonstrate it on examples involving both classical and quantum physics. For instance, our architecture finds a compact representation of an arbitrary two-qubit system that separates local parameters from parameters describing quantum correlations.
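The paper's actual architecture is a neural autoencoder with an attention mechanism; the details are not given in this record. As a purely illustrative stand-in, the linear toy below sketches the core idea the abstract describes: redundant observations generated by a few hidden physical parameters can be compressed to a compact latent code from which each parameter remains separately recoverable. All data and dimensions here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy observations generated from two hidden parameters and embedded
# redundantly in 8 dimensions (a stand-in for experimental data).
n = 500
params = rng.uniform(-1.0, 1.0, size=(n, 2))   # e.g. position and velocity
obs = params @ rng.normal(size=(2, 8))         # redundant 8-dim observations

# Linear "encoder": the top-2 right singular vectors compress each
# observation to a 2-dim latent code, discarding the redundancy.
_, _, vt = np.linalg.svd(obs, full_matrices=False)
latent = obs @ vt[:2].T                        # (n, 2)

# Linear "decoder": the same vectors reconstruct the observations,
# showing that two latent dimensions capture everything.
recon = latent @ vt[:2]
print("reconstruction error:", np.abs(recon - obs).max())

# The latent code is an invertible linear image of the true parameters,
# so each parameter is exactly recoverable from the compact code.
coef, *_ = np.linalg.lstsq(latent, params, rcond=None)
print("parameter recovery error:", np.abs(latent @ coef - params).max())
```

In the paper this separation is learned by a neural network and enforced operationally (via which latent components downstream tasks attend to), rather than obtained in closed form as in this linear sketch.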

ASJC Scopus subject areas

Cite this

Operationally meaningful representations of physical systems in neural networks. / Poulsen Nautrup, Hendrik; Metger, Tony; Iten, Raban et al.
In: Machine Learning: Science and Technology, Vol. 3, No. 4, 045025, 16.12.2022.


Poulsen Nautrup, H, Metger, T, Iten, R, Jerbi, S, Trenkwalder, LM, Wilming, H, Briegel, HJ & Renner, R 2022, 'Operationally meaningful representations of physical systems in neural networks', Machine Learning: Science and Technology, vol. 3, no. 4, 045025. https://doi.org/10.1088/2632-2153/ac9ae8
Poulsen Nautrup, H., Metger, T., Iten, R., Jerbi, S., Trenkwalder, L. M., Wilming, H., Briegel, H. J., & Renner, R. (2022). Operationally meaningful representations of physical systems in neural networks. Machine Learning: Science and Technology, 3(4), Artikel 045025. https://doi.org/10.1088/2632-2153/ac9ae8
Poulsen Nautrup H, Metger T, Iten R, Jerbi S, Trenkwalder LM, Wilming H et al. Operationally meaningful representations of physical systems in neural networks. Machine Learning: Science and Technology. 2022 Dec 16;3(4):045025. doi: 10.1088/2632-2153/ac9ae8
Poulsen Nautrup, Hendrik ; Metger, Tony ; Iten, Raban et al. / Operationally meaningful representations of physical systems in neural networks. In: Machine Learning: Science and Technology. 2022 ; Vol. 3, No. 4.
@article{b1a7b1f41ac548e28a1c7e069a3d4a28,
title = "Operationally meaningful representations of physical systems in neural networks",
abstract = "To make progress in science, we often build abstract representations of physical systems that meaningfully encode information about the systems. Such representations ignore redundant features and treat parameters such as velocity and position separately because they can be useful for making statements about different experimental settings. Here, we capture this notion by formally defining the concept of operationally meaningful representations. We present an autoencoder architecture with attention mechanism that can generate such representations and demonstrate it on examples involving both classical and quantum physics. For instance, our architecture finds a compact representation of an arbitrary two-qubit system that separates local parameters from parameters describing quantum correlations.",
keywords = "Bloch vector, neural networks, quantum physics, reinforcement learning, representation learning",
author = "{Poulsen Nautrup}, Hendrik and Tony Metger and Raban Iten and Sofiene Jerbi and Trenkwalder, {Lea M.} and Henrik Wilming and Briegel, {Hans J.} and Renato Renner",
note = "Funding Information: H P N, S J, L M T and H J B acknowledge support from the Austrian Science Fund (FWF) through the DK-ALM: W1259-N27 and SFB BeyondC F7102. R I, H W and R R acknowledge support from the Swiss National Science Foundation through SNSF Project No. 200020_165843 and 200021_188541. T M acknowledges support from ETH Z{\"u}rich and the ETH Foundation through the Excellence Scholarship & Opportunity Programme, and from the IQIM, an NSF Physics Frontiers Center (NSF Grant PHY-1125565) with support of the Gordon and Betty Moore Foundation (GBMF-12500028). S J also acknowledges the Austrian Academy of Sciences as a recipient of the DOC Fellowship. H J B acknowledges support by the Ministerium f{\"u}r Wissenschaft, Forschung, und Kunst Baden W{\"u}rttemberg (AZ:33-7533.-30-10/41/1) and by the European Research Council (ERC) under Project No. 101055129. This work was supported by the Swiss National Supercomputing Centre (CSCS) under Project ID da04.",
year = "2022",
month = dec,
day = "16",
doi = "10.1088/2632-2153/ac9ae8",
language = "English",
volume = "3",
number = "4",
journal = "Machine Learning: Science and Technology",
}


TY - JOUR

T1 - Operationally meaningful representations of physical systems in neural networks

AU - Poulsen Nautrup, Hendrik

AU - Metger, Tony

AU - Iten, Raban

AU - Jerbi, Sofiene

AU - Trenkwalder, Lea M.

AU - Wilming, Henrik

AU - Briegel, Hans J.

AU - Renner, Renato

N1 - Funding Information: H P N, S J, L M T and H J B acknowledge support from the Austrian Science Fund (FWF) through the DK-ALM: W1259-N27 and SFB BeyondC F7102. R I, H W and R R acknowledge support from the Swiss National Science Foundation through SNSF Project No. 200020_165843 and 200021_188541. T M acknowledges support from ETH Zürich and the ETH Foundation through the Excellence Scholarship & Opportunity Programme, and from the IQIM, an NSF Physics Frontiers Center (NSF Grant PHY-1125565) with support of the Gordon and Betty Moore Foundation (GBMF-12500028). S J also acknowledges the Austrian Academy of Sciences as a recipient of the DOC Fellowship. H J B acknowledges support by the Ministerium für Wissenschaft, Forschung, und Kunst Baden Württemberg (AZ:33-7533.-30-10/41/1) and by the European Research Council (ERC) under Project No. 101055129. This work was supported by the Swiss National Supercomputing Centre (CSCS) under Project ID da04.

PY - 2022/12/16

Y1 - 2022/12/16

N2 - To make progress in science, we often build abstract representations of physical systems that meaningfully encode information about the systems. Such representations ignore redundant features and treat parameters such as velocity and position separately because they can be useful for making statements about different experimental settings. Here, we capture this notion by formally defining the concept of operationally meaningful representations. We present an autoencoder architecture with attention mechanism that can generate such representations and demonstrate it on examples involving both classical and quantum physics. For instance, our architecture finds a compact representation of an arbitrary two-qubit system that separates local parameters from parameters describing quantum correlations.

AB - To make progress in science, we often build abstract representations of physical systems that meaningfully encode information about the systems. Such representations ignore redundant features and treat parameters such as velocity and position separately because they can be useful for making statements about different experimental settings. Here, we capture this notion by formally defining the concept of operationally meaningful representations. We present an autoencoder architecture with attention mechanism that can generate such representations and demonstrate it on examples involving both classical and quantum physics. For instance, our architecture finds a compact representation of an arbitrary two-qubit system that separates local parameters from parameters describing quantum correlations.

KW - Bloch vector

KW - neural networks

KW - quantum physics

KW - reinforcement learning

KW - representation learning

UR - http://www.scopus.com/inward/record.url?scp=85145253673&partnerID=8YFLogxK

U2 - 10.1088/2632-2153/ac9ae8

DO - 10.1088/2632-2153/ac9ae8

M3 - Article

AN - SCOPUS:85145253673

VL - 3

JO - Machine Learning: Science and Technology

JF - Machine Learning: Science and Technology

IS - 4

M1 - 045025

ER -
