Full metadata record (DC field: value [language])

dc.contributor.author: Zen, Remmy [en_US]
dc.contributor.author: My, Long [en_US]
dc.contributor.author: Tan, Ryan [en_US]
dc.contributor.author: Hébert, Frédéric [en_US]
dc.contributor.author: Gattobigio, Mario [en_US]
dc.contributor.author: Miniatura, Christian [en_US]
dc.contributor.author: Poletti, Dario [en_US]
dc.contributor.author: Bressan, Stéphane [en_US]
dc.identifier.citation: Zen, R., My, L., Tan, R., Hébert, F., Gattobigio, M., Miniatura, C., . . . Bressan, S. (2020). Transfer learning for scalability of neural-network quantum states. Physical Review E, 101(5), 053301. doi:10.1103/physreve.101.053301 [en_US]
dc.description.abstract: Neural-network quantum states have shown great potential for the study of many-body quantum systems. In statistical machine learning, transfer learning designates protocols reusing features of a machine learning model trained for a problem to solve a possibly related but different problem. We propose to evaluate the potential of transfer learning to improve the scalability of neural-network quantum states. We devise and present physics-inspired transfer learning protocols, reusing the features of neural-network quantum states learned for the computation of the ground state of a small system for systems of larger sizes. We implement different protocols for restricted Boltzmann machines on general-purpose graphics processing units. This implementation alone yields a speedup over existing implementations on multicore and distributed central processing units in comparable settings. We empirically and comparatively evaluate the efficiency (time) and effectiveness (accuracy) of different transfer learning protocols as we scale the system size in different models and different quantum phases. Namely, we consider both the transverse field Ising and Heisenberg XXZ models in one dimension, as well as in two dimensions for the latter, with system sizes up to 128 and 8×8 spins. We empirically demonstrate that some of the transfer learning protocols that we have devised can be far more effective and efficient than starting from neural-network quantum states with randomly initialized parameters. [en_US]
dc.description.sponsorship: National Supercomputing Centre (NSCC) Singapore [en_US]
dc.relation.ispartof: Physical Review E [en_US]
dc.rights: © 2020 American Physical Society. All rights reserved. This paper was published in Physical Review E and is made available with permission of American Physical Society. [en_US]
dc.title: Transfer learning for scalability of neural-network quantum states [en_US]
dc.type: Journal Article [en]
dc.contributor.school: School of Physical and Mathematical Sciences [en_US]
dc.description.version: Published version [en_US]
dc.subject.keywords: Quantum Statistical Mechanics [en_US]
dc.subject.keywords: Quantum Spin Models [en_US]
dc.description.acknowledgement: We acknowledge C. Guo and Supremacy Future Technologies for support on the matrix product state simulations. This work was partially funded by the National University of Singapore, the French Ministry of European and Foreign Affairs, and the French Ministry of Higher Education, Research and Innovation under the Merlion program as Merlion Project "Deep Quantum." Some of the experiments reported in this article were performed on the infrastructure of Singapore National Supercomputing Centre and were funded under project "Computing the Deep Quantum." [en_US]
item.fulltext: With Fulltext
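The abstract describes protocols that reuse the parameters of a restricted Boltzmann machine trained on a small system to initialize the ansatz for a larger system. The paper's actual protocols are not reproduced here; the sketch below is only an illustrative "tiling" transfer under assumed conventions (NumPy, ±1 spins, and the standard RBM log-amplitude), with all function names invented for this example.

```python
import numpy as np

def rbm_log_amplitude(spins, a, b, W):
    # Standard RBM ansatz for a neural-network quantum state:
    # log psi(s) = sum_i a_i s_i + sum_j log(2 cosh(b_j + sum_i W_ij s_i))
    theta = b + spins @ W
    return spins @ a + np.sum(np.log(2.0 * np.cosh(theta)))

def tile_transfer(a_small, b_small, W_small, k):
    # Illustrative transfer-learning initialization (hypothetical example,
    # not the paper's exact protocol): replicate trained small-system
    # parameters k times to seed a system k times larger.
    a_big = np.tile(a_small, k)
    b_big = np.tile(b_small, k)
    # Block-diagonal weight matrix: each copy of the small system keeps its
    # trained couplings; cross-block couplings start at zero and are then
    # refined by further variational optimization on the large system.
    n, m = W_small.shape
    W_big = np.zeros((k * n, k * m))
    for i in range(k):
        W_big[i * n:(i + 1) * n, i * m:(i + 1) * m] = W_small
    return a_big, b_big, W_big
```

With this block-diagonal choice, a tiled spin configuration on the large system has exactly k times the small-system log-amplitude, so the transferred state starts as a product of trained small-system states rather than from random parameters.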
Appears in Collections:SPMS Journal Articles
Files in This Item:
PhysRevE.101.053301.pdf (1.88 MB, Adobe PDF)

Citations: 20 (updated Sep 24, 2023)
Web of Science™ citations: 10 (updated Sep 24, 2023)


Download(s): 50 (updated Sep 26, 2023)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.