Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/172524
Title: Privacy-enhanced knowledge transfer with collaborative split learning over teacher ensembles
Authors: Liu, Ziyao
Guo, Jiale
Yang, Mengmeng
Yang, Wenzhuo
Fan, Jiani
Lam, Kwok-Yan
Keywords: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Issue Date: 2023
Source: Liu, Z., Guo, J., Yang, M., Yang, W., Fan, J. & Lam, K. (2023). Privacy-enhanced knowledge transfer with collaborative split learning over teacher ensembles. 2023 Secure and Trustworthy Deep Learning Systems Workshop (SecTL'23). https://dx.doi.org/10.1145/3591197.3591303
Conference: 2023 Secure and Trustworthy Deep Learning Systems Workshop (SecTL'23)
Abstract: Knowledge Transfer has received much attention for its ability to transfer knowledge, rather than data, from one application task to another. In order to comply with stringent data privacy regulations, privacy-preserving knowledge transfer is highly desirable. The Private Aggregation of Teacher Ensembles (PATE) scheme is one promising approach to address this privacy concern while supporting knowledge transfer from an ensemble of "teacher" models to a "student" model under the coordination of an aggregator. To further protect the data privacy of the student node, the privacy-enhanced version of PATE makes use of cryptographic techniques at the expense of heavy computation overheads at the teacher nodes. However, this inevitably hinders the adoption of knowledge transfer due to the highly disparate computational capabilities of teachers. Besides, in real-life systems, participating teachers may drop out of the system at any time, which causes new security risks for the adopted cryptographic building blocks. Thus, it is desirable to devise privacy-enhanced knowledge transfer that can run on teacher nodes with relatively fewer computational resources and can preserve privacy even when teacher nodes drop out. In this connection, we propose a dropout-resilient and privacy-enhanced knowledge transfer scheme, Collaborative Split learning over Teacher Ensembles (CSTE), which supports the participating teacher nodes in training and inferring with their local models using split learning. CSTE not only allows the compute-intensive processing to be performed at a split learning server, but also protects the data privacy of teacher nodes from collusion between the student node and the aggregator. Experimental results showed that CSTE achieves significant efficiency improvements over existing schemes.
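The PATE aggregation step mentioned in the abstract can be illustrated with a minimal sketch: each teacher votes a label, the aggregator tallies the votes, perturbs the tally with Laplace noise, and releases the noisy argmax to the student. This is a simplified, non-cryptographic illustration of the baseline PATE mechanism (not of CSTE itself); the function name and parameters are illustrative assumptions.

```python
import numpy as np

def pate_noisy_argmax(teacher_votes, num_classes, epsilon=1.0, rng=None):
    """Aggregate teacher label votes with Laplace noise (PATE-style noisy max).

    teacher_votes: one predicted class label per teacher.
    epsilon: noise parameter; smaller values give stronger privacy.
    """
    rng = rng or np.random.default_rng(0)  # fixed seed for a reproducible sketch
    # Tally votes per class, then add Laplace(1/epsilon) noise to each count.
    counts = np.bincount(teacher_votes, minlength=num_classes).astype(float)
    counts += rng.laplace(scale=1.0 / epsilon, size=num_classes)
    # The student only ever sees this noisy winning label, not the raw votes.
    return int(np.argmax(counts))

# Example: 10 teachers, 9 vote for class 2 and 1 votes for class 0.
votes = [2] * 9 + [0]
label = pate_noisy_argmax(votes, num_classes=3, epsilon=2.0)
```

With a clear majority, the noisy argmax almost always returns the true plurality label; privacy comes from the fact that changing any single teacher's vote shifts each count by at most one, which the Laplace noise masks.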
URI: https://hdl.handle.net/10356/172524
ISBN: 9798400701818
DOI: 10.1145/3591197.3591303
Schools: School of Computer Science and Engineering 
Research Centres: Strategic Centre for Research in Privacy-Preserving Technologies & Systems
Rights: © 2023 Copyright held by the owner/author(s). This work is licensed under a Creative Commons Attribution International 4.0 License.
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Conference Papers

Files in This Item:
File: 3591197.3591303.pdf (2.01 MB, Adobe PDF)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.