Details
| Original language | English |
|---|---|
| Article number | 045025 |
| Journal | Machine Learning: Science and Technology |
| Volume | 3 |
| Issue number | 4 |
| Publication status | Published - 16 Dec 2022 |
| Externally published | Yes |
Abstract
To make progress in science, we often build abstract representations of physical systems that meaningfully encode information about the systems. Such representations ignore redundant features and treat parameters such as velocity and position separately because they can be useful for making statements about different experimental settings. Here, we capture this notion by formally defining the concept of operationally meaningful representations. We present an autoencoder architecture with an attention mechanism that can generate such representations and demonstrate it on examples involving both classical and quantum physics. For instance, our architecture finds a compact representation of an arbitrary two-qubit system that separates local parameters from parameters describing quantum correlations.
Keywords
- Bloch vector, neural networks, quantum physics, reinforcement learning, representation learning
ASJC Scopus subject areas
- Computer Science(all)
- Software
- Human-Computer Interaction
- Artificial Intelligence
Cite this
In: Machine Learning: Science and Technology, Vol. 3, No. 4, 045025, 16.12.2022.
Research output: Contribution to journal › Article › Research › peer review
TY - JOUR
T1 - Operationally meaningful representations of physical systems in neural networks
AU - Poulsen Nautrup, Hendrik
AU - Metger, Tony
AU - Iten, Raban
AU - Jerbi, Sofiene
AU - Trenkwalder, Lea M.
AU - Wilming, Henrik
AU - Briegel, Hans J.
AU - Renner, Renato
N1 - Funding Information: H P N, S J, L M T and H J B acknowledge support from the Austrian Science Fund (FWF) through the DK-ALM: W1259-N27 and SFB BeyondC F7102. R I, H W and R R acknowledge support from the Swiss National Science Foundation through SNSF Project No. 200020_165843 and 200021_188541. T M acknowledges support from ETH Zürich and the ETH Foundation through the Excellence Scholarship & Opportunity Programme, and from the IQIM, an NSF Physics Frontiers Center (NSF Grant PHY-1125565) with support of the Gordon and Betty Moore Foundation (GBMF-12500028). S J also acknowledges the Austrian Academy of Sciences as a recipient of the DOC Fellowship. H J B acknowledges support by the Ministerium für Wissenschaft, Forschung und Kunst Baden-Württemberg (AZ:33-7533.-30-10/41/1) and by the European Research Council (ERC) under Project No. 101055129. This work was supported by the Swiss National Supercomputing Centre (CSCS) under Project ID da04.
PY - 2022/12/16
Y1 - 2022/12/16
N2 - To make progress in science, we often build abstract representations of physical systems that meaningfully encode information about the systems. Such representations ignore redundant features and treat parameters such as velocity and position separately because they can be useful for making statements about different experimental settings. Here, we capture this notion by formally defining the concept of operationally meaningful representations. We present an autoencoder architecture with an attention mechanism that can generate such representations and demonstrate it on examples involving both classical and quantum physics. For instance, our architecture finds a compact representation of an arbitrary two-qubit system that separates local parameters from parameters describing quantum correlations.
AB - To make progress in science, we often build abstract representations of physical systems that meaningfully encode information about the systems. Such representations ignore redundant features and treat parameters such as velocity and position separately because they can be useful for making statements about different experimental settings. Here, we capture this notion by formally defining the concept of operationally meaningful representations. We present an autoencoder architecture with an attention mechanism that can generate such representations and demonstrate it on examples involving both classical and quantum physics. For instance, our architecture finds a compact representation of an arbitrary two-qubit system that separates local parameters from parameters describing quantum correlations.
KW - Bloch vector
KW - neural networks
KW - quantum physics
KW - reinforcement learning
KW - representation learning
UR - http://www.scopus.com/inward/record.url?scp=85145253673&partnerID=8YFLogxK
U2 - 10.1088/2632-2153/ac9ae8
DO - 10.1088/2632-2153/ac9ae8
M3 - Article
AN - SCOPUS:85145253673
VL - 3
JO - Machine Learning: Science and Technology
JF - Machine Learning: Science and Technology
IS - 4
M1 - 045025
ER -