Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/180139
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Yi, Liping [en_US]
dc.contributor.author: Shi, Xiaorong [en_US]
dc.contributor.author: Wang, Nan [en_US]
dc.contributor.author: Wang, Gang [en_US]
dc.contributor.author: Liu, Xiaoguang [en_US]
dc.contributor.author: Shi, Zhuan [en_US]
dc.contributor.author: Yu, Han [en_US]
dc.date.accessioned: 2024-09-18T07:09:58Z
dc.date.available: 2024-09-18T07:09:58Z
dc.date.issued: 2024
dc.identifier.citation: Yi, L., Shi, X., Wang, N., Wang, G., Liu, X., Shi, Z. & Yu, H. (2024). pFedKT: personalized federated learning with dual knowledge transfer. Knowledge-Based Systems, 292, 111633. https://dx.doi.org/10.1016/j.knosys.2024.111633 [en_US]
dc.identifier.issn: 0950-7051 [en_US]
dc.identifier.uri: https://hdl.handle.net/10356/180139
dc.description.abstract: Federated learning (FL) has been widely studied as an emerging privacy-preserving machine learning paradigm for achieving multi-party collaborative model training on decentralized data. In practice, such data tend to follow non-independent and identically distributed (non-IID) data distributions. Thus, the performance of models obtained through vanilla horizontal FL tends to vary significantly across FL clients. To tackle this challenge, a new subfield of FL – personalized federated learning (PFL) – has emerged for producing personalized FL models that can perform well on diverse local datasets. Existing PFL approaches are limited in terms of effectively transferring knowledge among clients to improve model generalization while achieving good performance on diverse local datasets. To bridge this important gap, we propose the personalized Federated Knowledge Transfer (pFedKT) approach. It involves dual knowledge transfer: (1) transferring historical local knowledge to local models via local hypernetworks; and (2) transferring the latest global knowledge to local models through contrastive learning. By fusing historical local knowledge and the latest global knowledge, the personalization and generalization of individual models for FL clients can be simultaneously enhanced. We provide a theoretical analysis of the generalization and convergence of pFedKT. Extensive experiments on 3 real-world datasets demonstrate that pFedKT achieves 0.74%–1.62% higher test accuracy compared to 14 state-of-the-art baselines. [en_US]
dc.description.sponsorship: Agency for Science, Technology and Research (A*STAR) [en_US]
dc.description.sponsorship: National Research Foundation (NRF) [en_US]
dc.language.iso: en [en_US]
dc.relation: AISG2-RP-2020-019 [en_US]
dc.relation: A20G8b0102 [en_US]
dc.relation.ispartof: Knowledge-Based Systems [en_US]
dc.rights: © 2024 Elsevier B.V. All rights reserved. [en_US]
dc.subject: Computer and Information Science [en_US]
dc.title: pFedKT: personalized federated learning with dual knowledge transfer [en_US]
dc.type: Journal Article [en]
dc.contributor.school: School of Computer Science and Engineering [en_US]
dc.identifier.doi: 10.1016/j.knosys.2024.111633
dc.identifier.scopus: 2-s2.0-85187792396
dc.identifier.volume: 292 [en_US]
dc.identifier.spage: 111633 [en_US]
dc.subject.keywords: Personalized federated learning [en_US]
dc.subject.keywords: Knowledge transfer [en_US]
dc.description.acknowledgement: This research is supported in part by the National Science Foundation of China under Grant 62272252 and 62272253, the Key Research and Development Program of Guangdong under Grant 2021B0101310002, and the Fundamental Research Funds for the Central Universities; the National Research Foundation Singapore and DSO National Laboratories under the AI Singapore Programme (AISG Award No: AISG2-RP-2020-019); the RIE 2020 Advanced Manufacturing and Engineering (AME) Programmatic Fund (No. A20G8b0102), Singapore. [en_US]
item.fulltext: No Fulltext
item.grantfulltext: none
Appears in Collections: SCSE Journal Articles
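The abstract's second transfer channel, pulling each local model toward the latest global knowledge via contrastive learning, can be illustrated with a minimal numerical sketch. This is not the paper's code: the model-contrastive formulation below (positive pair with the global representation, negative pair with the stale local one), the temperature value, and all names are assumptions made for illustration only.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two representation vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def global_contrastive_loss(z_local, z_global, z_prev, tau=0.5):
    """Illustrative contrastive term: pull the current local representation
    toward the global model's representation (positive pair) and away from
    the previous-round local one (negative pair). Hypothetical stand-in for
    pFedKT's global-knowledge transfer, not the published implementation."""
    pos = np.exp(cosine(z_local, z_global) / tau)
    neg = np.exp(cosine(z_local, z_prev) / tau)
    return -np.log(pos / (pos + neg))

# Toy check: a representation aligned with the global model incurs a
# smaller loss than one still aligned with the stale local model.
z_glob = np.array([1.0, 0.0])
z_prev = np.array([0.0, 1.0])
aligned = global_contrastive_loss(np.array([0.9, 0.1]), z_glob, z_prev)
stale = global_contrastive_loss(np.array([0.1, 0.9]), z_glob, z_prev)
```

Minimizing such a term during local training injects global knowledge without shipping raw data, which is consistent with the privacy-preserving framing in the abstract; the hypernetwork channel for historical local knowledge is omitted here for brevity.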

SCOPUS™ Citations: 4 (updated on Feb 7, 2025)
Page view(s): 54 (updated on Feb 11, 2025)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.