Transfer learning for scalability of neural-network quantum states

Neural-network quantum states have shown great potential for the study of many-body quantum systems. In statistical machine learning, transfer learning designates protocols reusing features of a machine learning model trained for a problem to solve a possibly related but different problem. We propose to evaluate the potential of transfer learning to improve the scalability of neural-network quantum states. We devise and present physics-inspired transfer learning protocols, reusing the features of neural-network quantum states learned for the computation of the ground state of a small system for systems of larger sizes. We implement different protocols for restricted Boltzmann machines on general-purpose graphics processing units. This implementation alone yields a speedup over existing implementations on multicore and distributed central processing units in comparable settings. We empirically and comparatively evaluate the efficiency (time) and effectiveness (accuracy) of different transfer learning protocols as we scale the system size in different models and different quantum phases. Namely, we consider both the transverse field Ising and Heisenberg XXZ models in one dimension, as well as in two dimensions for the latter, with system sizes up to 128 and 8×8 spins. We empirically demonstrate that some of the transfer learning protocols that we have devised can be far more effective and efficient than starting from neural-network quantum states with randomly initialized parameters.
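For context, the restricted Boltzmann machine ansatz and the two benchmark models named in the abstract are conventionally written as follows. These are standard textbook forms with our own sign and coupling conventions, not expressions quoted from the paper itself:

\[
  \psi_{\mathcal{W}}(\sigma_1,\ldots,\sigma_N)
    = e^{\sum_i a_i \sigma_i}
      \prod_{j=1}^{M} 2\cosh\!\Big(b_j + \sum_{i=1}^{N} W_{ji}\,\sigma_i\Big),
\]
\[
  H_{\mathrm{TFI}} = -J \sum_{\langle i,j\rangle} \sigma^z_i \sigma^z_j
                     - h \sum_i \sigma^x_i,
  \qquad
  H_{\mathrm{XXZ}} = \sum_{\langle i,j\rangle}
     \big( \sigma^x_i \sigma^x_j + \sigma^y_i \sigma^y_j
           + \Delta\, \sigma^z_i \sigma^z_j \big),
\]

where the visible biases $a_i$, hidden biases $b_j$, and weights $W_{ji}$ are the parameters a transfer protocol would reuse.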


Bibliographic Details
Main Authors: Zen, Remmy, My, Long, Tan, Ryan, Hébert, Frédéric, Gattobigio, Mario, Miniatura, Christian, Poletti, Dario, Bressan, Stéphane
Other Authors: School of Physical and Mathematical Sciences
Format: Article
Language: English
Published: 2021
Subjects: Science::Physics; Quantum Statistical Mechanics; Quantum Spin Models
Online Access: https://hdl.handle.net/10356/146572
Institution: Nanyang Technological University
id sg-ntu-dr.10356-146572
record_format dspace
spelling sg-ntu-dr.10356-146572 2023-02-28T19:55:27Z
title: Transfer learning for scalability of neural-network quantum states
authors: Zen, Remmy; My, Long; Tan, Ryan; Hébert, Frédéric; Gattobigio, Mario; Miniatura, Christian; Poletti, Dario; Bressan, Stéphane
affiliations: School of Physical and Mathematical Sciences; MajuLab@NTU
subjects: Science::Physics; Quantum Statistical Mechanics; Quantum Spin Models
abstract: Neural-network quantum states have shown great potential for the study of many-body quantum systems. In statistical machine learning, transfer learning designates protocols reusing features of a machine learning model trained for a problem to solve a possibly related but different problem. We propose to evaluate the potential of transfer learning to improve the scalability of neural-network quantum states. We devise and present physics-inspired transfer learning protocols, reusing the features of neural-network quantum states learned for the computation of the ground state of a small system for systems of larger sizes. We implement different protocols for restricted Boltzmann machines on general-purpose graphics processing units. This implementation alone yields a speedup over existing implementations on multicore and distributed central processing units in comparable settings. We empirically and comparatively evaluate the efficiency (time) and effectiveness (accuracy) of different transfer learning protocols as we scale the system size in different models and different quantum phases. Namely, we consider both the transverse field Ising and Heisenberg XXZ models in one dimension, as well as in two dimensions for the latter, with system sizes up to 128 and 8×8 spins. We empirically demonstrate that some of the transfer learning protocols that we have devised can be far more effective and efficient than starting from neural-network quantum states with randomly initialized parameters.
funding: National Supercomputing Centre (NSCC) Singapore
version: Published version
acknowledgments: We acknowledge C. Guo and Supremacy Future Technologies for support on the matrix product state simulations. This work was partially funded by the National University of Singapore, the French Ministry of European and Foreign Affairs, and the French Ministry of Higher Education, Research and Innovation under the Merlion program as Merlion Project "Deep Quantum." Some of the experiments reported in this article were performed on the infrastructure of the Singapore National Supercomputing Centre and were funded under project "Computing the Deep Quantum."
date accessioned: 2021-03-02T02:03:02Z
date available: 2021-03-02T02:03:02Z
date issued: 2020
type: Journal Article
citation: Zen, R., My, L., Tan, R., Hébert, F., Gattobigio, M., Miniatura, C., . . . Bressan, S. (2020). Transfer learning for scalability of neural-network quantum states. Physical Review E, 101(5), 053301. doi:10.1103/PhysRevE.101.053301
issn: 2470-0045
uri: https://hdl.handle.net/10356/146572
doi: 10.1103/PhysRevE.101.053301
pmid: 32575207
scopus: 2-s2.0-85086313225
issue: 5
volume: 101
language: en
journal: Physical Review E
rights: © 2020 American Physical Society. All rights reserved. This paper was published in Physical Review E and is made available with permission of the American Physical Society.
format: application/pdf
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
content_provider NTU Library
collection DR-NTU
language English
topic Science::Physics
Quantum Statistical Mechanics
Quantum Spin Models
spellingShingle Science::Physics
Quantum Statistical Mechanics
Quantum Spin Models
Zen, Remmy
My, Long
Tan, Ryan
Hébert, Frédéric
Gattobigio, Mario
Miniatura, Christian
Poletti, Dario
Bressan, Stéphane
Transfer learning for scalability of neural-network quantum states
description Neural-network quantum states have shown great potential for the study of many-body quantum systems. In statistical machine learning, transfer learning designates protocols reusing features of a machine learning model trained for a problem to solve a possibly related but different problem. We propose to evaluate the potential of transfer learning to improve the scalability of neural-network quantum states. We devise and present physics-inspired transfer learning protocols, reusing the features of neural-network quantum states learned for the computation of the ground state of a small system for systems of larger sizes. We implement different protocols for restricted Boltzmann machines on general-purpose graphics processing units. This implementation alone yields a speedup over existing implementations on multicore and distributed central processing units in comparable settings. We empirically and comparatively evaluate the efficiency (time) and effectiveness (accuracy) of different transfer learning protocols as we scale the system size in different models and different quantum phases. Namely, we consider both the transverse field Ising and Heisenberg XXZ models in one dimension, as well as in two dimensions for the latter, with system sizes up to 128 and 8×8 spins. We empirically demonstrate that some of the transfer learning protocols that we have devised can be far more effective and efficient than starting from neural-network quantum states with randomly initialized parameters.
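As an illustration of what "reusing the features of neural-network quantum states learned ... for systems of larger sizes" can look like in practice, the sketch below tiles the parameters of a restricted Boltzmann machine trained on N spins to seed one for 2N spins. This is a minimal, hypothetical sketch: the function name, the block-diagonal tiling, and the noise scale are our assumptions for illustration, not the specific protocols devised in the paper.

import numpy as np

def tile_rbm_parameters(a_small, b_small, W_small, repeat=2):
    """Seed a larger RBM from one trained on a smaller system (illustrative).

    a_small : (N,)   visible biases learned for N spins
    b_small : (M,)   hidden biases learned for M hidden units
    W_small : (M, N) learned weight matrix
    Returns parameters for an RBM with repeat*N visible and repeat*M hidden units.
    """
    N = a_small.shape[0]
    M = b_small.shape[0]
    a_large = np.tile(a_small, repeat)   # repeat visible biases along the chain
    b_large = np.tile(b_small, repeat)   # repeat hidden biases likewise
    W_large = np.zeros((repeat * M, repeat * N), dtype=W_small.dtype)
    for k in range(repeat):
        # block-diagonal tiling: each copy of the hidden layer is wired
        # to one copy of the small system's visible layer
        W_large[k * M:(k + 1) * M, k * N:(k + 1) * N] = W_small
    # small random perturbation so optimization can break the artificial
    # repetition symmetry (noise scale is an assumption, not from the paper)
    W_large += 1e-3 * np.random.randn(*W_large.shape)
    return a_large, b_large, W_large

# usage: parameters trained on 16 spins seed an RBM for 32 spins
a16, b16, W16 = np.zeros(16), np.zeros(32), 0.01 * np.random.randn(32, 16)
a32, b32, W32 = tile_rbm_parameters(a16, b16, W16)

The tiled network is then used as the starting point for further variational optimization on the larger system, which is where the reported gains over randomly initialized parameters would appear.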
author2 School of Physical and Mathematical Sciences
author_facet School of Physical and Mathematical Sciences
Zen, Remmy
My, Long
Tan, Ryan
Hébert, Frédéric
Gattobigio, Mario
Miniatura, Christian
Poletti, Dario
Bressan, Stéphane
format Article
author Zen, Remmy
My, Long
Tan, Ryan
Hébert, Frédéric
Gattobigio, Mario
Miniatura, Christian
Poletti, Dario
Bressan, Stéphane
author_sort Zen, Remmy
title Transfer learning for scalability of neural-network quantum states
title_short Transfer learning for scalability of neural-network quantum states
title_full Transfer learning for scalability of neural-network quantum states
title_fullStr Transfer learning for scalability of neural-network quantum states
title_full_unstemmed Transfer learning for scalability of neural-network quantum states
title_sort transfer learning for scalability of neural-network quantum states
publishDate 2021
url https://hdl.handle.net/10356/146572
_version_ 1759853378247065600