Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/163377
Title: Semi-supervised federated heterogeneous transfer learning
Authors: Feng, Siwei
Li, Boyang
Yu, Han
Liu, Yang
Yang, Qiang
Keywords: Engineering::Computer science and engineering
Issue Date: 2022
Source: Feng, S., Li, B., Yu, H., Liu, Y. & Yang, Q. (2022). Semi-supervised federated heterogeneous transfer learning. Knowledge-Based Systems, 252, 109384. https://dx.doi.org/10.1016/j.knosys.2022.109384
Project: AISG2-RP-2020-019
A20G8b0102
NWJ-2020-008
NSC-2019-011
Journal: Knowledge-Based Systems
Abstract: Federated learning (FL) is a privacy-preserving paradigm that collaboratively trains machine learning models on distributed data stored in different silos without exposing sensitive information. Unlike most existing FL approaches, which require that data from different parties share either the same feature space or the same sample ID space, federated transfer learning (FTL), a recently proposed FL concept, is designed for situations where data from different parties differ not only in samples but also in feature space. However, like most traditional FL approaches, FTL methods suffer from issues caused by the insufficiency of overlapping data. In this paper, we propose a novel FTL framework, referred to as Semi-Supervised Federated Heterogeneous Transfer Learning (SFHTL), which leverages unlabeled non-overlapping samples to reduce the model overfitting that results from insufficient overlapping training samples in FL scenarios. Unlike existing FTL approaches, SFHTL makes use of non-overlapping samples from all parties to expand the training set of each party and improve local model performance. Through extensive experimental evaluation on real-world datasets, we demonstrate significant advantages of SFHTL over state-of-the-art approaches.
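The abstract describes expanding each party's local training set with pseudo-labeled non-overlapping samples. The Python sketch below illustrates that general idea only and is not the paper's algorithm: the variable names, the Ridge-based feature reconstruction, the confidence threshold, and the use of plain (unencrypted) data are all assumptions made for illustration, and the privacy-preserving protocols a real federated deployment would require are omitted.

    import numpy as np
    from sklearn.linear_model import Ridge, LogisticRegression

    rng = np.random.default_rng(0)

    # Hypothetical setup: party A and party B hold heterogeneous feature spaces.
    # Only n_overlap sample IDs are aligned across the two parties; party A also
    # holds the labels for those overlapping samples.
    n_overlap, n_extra = 50, 500
    Xa_overlap = rng.normal(size=(n_overlap, 20))    # party A features (overlap)
    Xb_overlap = rng.normal(size=(n_overlap, 30))    # party B features (overlap)
    y_overlap = rng.integers(0, 2, size=n_overlap)   # labels held by party A
    Xb_extra = rng.normal(size=(n_extra, 30))        # party B non-overlap, unlabeled

    # Step 1 (assumed): on the overlapping samples, learn a mapping from party B's
    # feature space to party A's, so the missing party-A features can be
    # reconstructed for party B's non-overlapping samples.
    feature_mapper = Ridge(alpha=1.0).fit(Xb_overlap, Xa_overlap)
    Xa_extra_hat = feature_mapper.predict(Xb_extra)  # reconstructed party-A features

    # Step 2 (assumed): train a label predictor on the labeled overlap, then
    # pseudo-label the non-overlapping samples, keeping only confident predictions.
    label_model = LogisticRegression(max_iter=1000).fit(
        np.hstack([Xa_overlap, Xb_overlap]), y_overlap)
    proba = label_model.predict_proba(np.hstack([Xa_extra_hat, Xb_extra]))
    confident = proba.max(axis=1) > 0.8
    pseudo_labels = proba.argmax(axis=1)[confident]

    # Step 3: expand the local training set with the pseudo-labeled samples.
    X_train = np.vstack([np.hstack([Xa_overlap, Xb_overlap]),
                         np.hstack([Xa_extra_hat, Xb_extra])[confident]])
    y_train = np.concatenate([y_overlap, pseudo_labels])
    print(f"Training set expanded from {n_overlap} to {len(y_train)} samples")

In this toy setting the confidence threshold simply trades off how many non-overlapping samples are admitted against pseudo-label noise; the paper's actual mechanism for exploiting non-overlapping samples should be taken from the full text.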
URI: https://hdl.handle.net/10356/163377
ISSN: 0950-7051
DOI: 10.1016/j.knosys.2022.109384
Rights: © 2022 Elsevier B.V. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:SCSE Journal Articles

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.