Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/180139
Title: pFedKT: personalized federated learning with dual knowledge transfer
Authors: Yi, Liping; Shi, Xiaorong; Wang, Nan; Wang, Gang; Liu, Xiaoguang; Shi, Zhuan; Yu, Han
Keywords: Computer and Information Science
Issue Date: 2024
Source: Yi, L., Shi, X., Wang, N., Wang, G., Liu, X., Shi, Z. & Yu, H. (2024). pFedKT: personalized federated learning with dual knowledge transfer. Knowledge-Based Systems, 292, 111633. https://dx.doi.org/10.1016/j.knosys.2024.111633
Project: AISG2-RP-2020-019; A20G8b0102
Journal: Knowledge-Based Systems
Abstract: Federated learning (FL) has been widely studied as an emerging privacy-preserving machine learning paradigm for achieving multi-party collaborative model training on decentralized data. In practice, such data tend to follow non-independent and identically distributed (non-IID) distributions. Thus, the performance of models obtained through vanilla horizontal FL tends to vary significantly across FL clients. To tackle this challenge, a new subfield of FL, personalized federated learning (PFL), has emerged for producing personalized FL models that perform well on diverse local datasets. Existing PFL approaches are limited in their ability to transfer knowledge among clients in ways that improve model generalization while maintaining strong performance on diverse local datasets. To bridge this important gap, we propose the personalized Federated Knowledge Transfer (pFedKT) approach. It involves dual knowledge transfer: (1) transferring historical local knowledge to local models via local hypernetworks; and (2) transferring the latest global knowledge to local models through contrastive learning. By fusing historical local knowledge with the latest global knowledge, the personalization and generalization of individual FL clients' models can be enhanced simultaneously. We provide theoretical analyses of the generalization and convergence of pFedKT. Extensive experiments on 3 real-world datasets demonstrate that pFedKT achieves 0.74%–1.62% higher test accuracy than 14 state-of-the-art baselines.
URI: https://hdl.handle.net/10356/180139
ISSN: 0950-7051
DOI: 10.1016/j.knosys.2024.111633
Schools: School of Computer Science and Engineering
Rights: © 2024 Elsevier B.V. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections: SCSE Journal Articles
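
The abstract above describes a dual knowledge transfer scheme: a local hypernetwork carries historical local knowledge, and contrastive learning injects the latest global knowledge into each client's model. What follows is a minimal, hedged sketch of what such a client update could look like in PyTorch. Every name (LocalNet, LocalHyperNet, client_update), loss term, and hyperparameter below is an illustrative assumption based only on the abstract, not the paper's actual pFedKT algorithm, which is not available here.

# Hedged, illustrative sketch only: names, loss terms, and hyperparameters are
# assumptions made for illustration, not the paper's actual pFedKT method.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


class LocalNet(nn.Module):
    """Tiny example client model with a feature extractor and a linear head."""
    def __init__(self, in_dim=32, feat_dim=16, n_classes=10):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.head = nn.Linear(feat_dim, n_classes)

    def features(self, x):
        return self.backbone(x)

    def forward(self, x):
        return self.head(self.features(x))


class LocalHyperNet(nn.Module):
    """Hypothetical per-client hypernetwork: a learnable client embedding is
    mapped to a flat weight vector intended to carry historical local knowledge."""
    def __init__(self, embed_dim, target_numel):
        super().__init__()
        self.embed = nn.Parameter(torch.randn(embed_dim))
        self.mlp = nn.Sequential(nn.Linear(embed_dim, 64), nn.ReLU(),
                                 nn.Linear(64, target_numel))

    def forward(self):
        return self.mlp(self.embed)


def client_update(local_model, global_model, hypernet, loader,
                  mu=1.0, lam=0.1, temperature=0.5, lr=0.01):
    """One client round combining both transfer paths (illustrative):
    (1) pull the local head toward weights generated by the local hypernetwork
        (historical local knowledge), and
    (2) pull local features toward the frozen global model's features with a
        simple contrastive-style cosine term (latest global knowledge)."""
    global_model = copy.deepcopy(global_model).eval()
    opt = torch.optim.SGD(list(local_model.parameters()) +
                          list(hypernet.parameters()), lr=lr)
    for x, y in loader:
        # (1) Historical local knowledge: align head weights and hypernet output.
        head_flat = torch.cat([p.flatten() for p in local_model.head.parameters()])
        hyper_loss = F.mse_loss(head_flat, hypernet())

        # (2) Latest global knowledge: align local and global representations.
        z_local = local_model.features(x)
        with torch.no_grad():
            z_global = global_model.features(x)
        contrast_loss = -(F.cosine_similarity(z_local, z_global, dim=-1)
                          / temperature).mean()

        loss = (F.cross_entropy(local_model.head(z_local), y)
                + lam * hyper_loss + mu * contrast_loss)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return local_model.state_dict()


if __name__ == "__main__":
    model, global_model = LocalNet(), LocalNet()
    target = sum(p.numel() for p in model.head.parameters())
    hnet = LocalHyperNet(embed_dim=8, target_numel=target)
    # A toy batch standing in for one client's non-IID local data.
    loader = [(torch.randn(16, 32), torch.randint(0, 10, (16,)))]
    client_update(model, global_model, hnet, loader)

The cosine-similarity term stands in for the contrastive alignment with the global model mentioned in the abstract; a full contrastive loss with negative pairs, and the paper's exact way of fusing hypernetwork output into the local model, are omitted from this sketch.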
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.