Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/146572
Title: Transfer learning for scalability of neural-network quantum states
Authors: Zen, Remmy; My, Long; Tan, Ryan; Hébert, Frédéric; Gattobigio, Mario; Miniatura, Christian; Poletti, Dario; Bressan, Stéphane
Keywords: Science::Physics
Issue Date: 2020
Source: Zen, R., My, L., Tan, R., Hébert, F., Gattobigio, M., Miniatura, C., . . . Bressan, S. (2020). Transfer learning for scalability of neural-network quantum states. Physical Review E, 101(5), 053301. doi:10.1103/physreve.101.053301
Journal: Physical Review E
Abstract: Neural-network quantum states have shown great potential for the study of many-body quantum systems. In statistical machine learning, transfer learning designates protocols reusing features of a machine learning model trained for a problem to solve a possibly related but different problem. We propose to evaluate the potential of transfer learning to improve the scalability of neural-network quantum states. We devise and present physics-inspired transfer learning protocols, reusing the features of neural-network quantum states learned for the computation of the ground state of a small system for systems of larger sizes. We implement different protocols for restricted Boltzmann machines on general-purpose graphics processing units. This implementation alone yields a speedup over existing implementations on multicore and distributed central processing units in comparable settings. We empirically and comparatively evaluate the efficiency (time) and effectiveness (accuracy) of different transfer learning protocols as we scale the system size in different models and different quantum phases. Namely, we consider both the transverse field Ising and Heisenberg XXZ models in one dimension, as well as in two dimensions for the latter, with system sizes up to 128 and 8×8 spins. We empirically demonstrate that some of the transfer learning protocols that we have devised can be far more effective and efficient than starting from neural-network quantum states with randomly initialized parameters.
URI: https://hdl.handle.net/10356/146572
ISSN: 2470-0045
DOI: 10.1103/PhysRevE.101.053301
Schools: School of Physical and Mathematical Sciences
Organisations: MajuLab@NTU
Rights: © 2020 American Physical Society. All rights reserved. This paper was published in Physical Review E and is made available with permission of American Physical Society.
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections: | SPMS Journal Articles |
Files in This Item:
File | Description | Size | Format
---|---|---|---
PhysRevE.101.053301.pdf | | 1.88 MB | Adobe PDF
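
To make the transfer idea described in the abstract concrete, the sketch below shows how the parameters of a restricted Boltzmann machine trained as a neural-network quantum state on a small spin chain could be reused to warm-start a larger chain. This is a minimal illustration, not the authors' implementation: the class `RBMState`, the function `tile_transfer`, and the block-diagonal tiling scheme are assumptions written in the spirit of the physics-inspired protocols the paper evaluates.

```python
import numpy as np

class RBMState:
    """RBM ansatz: log psi(s) = a.s + sum_j log(2 cosh(b_j + (W s)_j))."""

    def __init__(self, n_visible, alpha=2, rng=None):
        rng = rng or np.random.default_rng(0)
        n_hidden = alpha * n_visible
        self.a = 0.01 * rng.standard_normal(n_visible)             # visible biases
        self.b = 0.01 * rng.standard_normal(n_hidden)              # hidden biases
        self.W = 0.01 * rng.standard_normal((n_hidden, n_visible)) # couplings

    def log_psi(self, s):
        """Log-amplitude of a spin configuration s in {-1, +1}^n_visible."""
        theta = self.b + self.W @ s
        return self.a @ s + np.sum(np.log(2.0 * np.cosh(theta)))

def tile_transfer(small, k=2):
    """Hypothetical 'tiling' transfer: initialize an RBM for a k-times-larger
    chain by repeating the small system's parameters, exploiting translation
    invariance. Couplings become block diagonal, one copy of W per tile."""
    n_v = small.a.size
    big = RBMState(k * n_v)
    big.a = np.tile(small.a, k)
    big.b = np.tile(small.b, k)
    big.W = np.kron(np.eye(k), small.W)  # block-diagonal repetition of W
    return big

# Usage: train `small` on a 16-spin chain (training loop omitted),
# then warm-start a 32-spin chain from its parameters.
small = RBMState(16)
large = tile_transfer(small, k=2)
s = np.random.default_rng(1).choice([-1, 1], size=32)
print(large.log_psi(s))
```

Under this assumed scheme, the large ansatz starts out near a product of small-system ground states rather than a random point, which is the intuition behind transferring learned features before further variational optimization on the larger system.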