Privacy-enhanced knowledge transfer with collaborative split learning over teacher ensembles

Knowledge Transfer has received much attention for its ability to transfer knowledge, rather than data, from one application task to another. To comply with stringent data privacy regulations, privacy-preserving knowledge transfer is highly desirable. The Private Aggregation of Teacher Ensembles (PATE) scheme is one promising approach to addressing this privacy concern while supporting knowledge transfer from an ensemble of "teacher" models to a "student" model under the coordination of an aggregator. To further protect the data privacy of the student node, the privacy-enhanced version of PATE makes use of cryptographic techniques at the expense of heavy computation overheads at the teacher nodes. This inevitably hinders the adoption of knowledge transfer, because the computational capabilities of teachers are highly disparate. Moreover, in real-life systems, participating teachers may drop out at any time, which introduces new security risks for the adopted cryptographic building blocks. It is therefore desirable to devise privacy-enhanced knowledge transfer that can run on teacher nodes with relatively limited computational resources and that preserves privacy even when teacher nodes drop out. To this end, we propose a dropout-resilient and privacy-enhanced knowledge transfer scheme, Collaborative Split learning over Teacher Ensembles (CSTE), which enables the participating teacher nodes to train and run inference on their local models using split learning. CSTE not only allows the compute-intensive processing to be performed at a split learning server, but also protects the data privacy of teacher nodes from collusion between the student node and the aggregator. Experimental results show that CSTE achieves significant efficiency improvements over existing schemes.
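The abstract refers to two building blocks: PATE-style noisy aggregation of teacher votes, and split learning, where each teacher keeps only the lightweight lower layers of its model and offloads the rest to a split learning server. The sketch below is a minimal, non-cryptographic illustration of how those pieces fit together; the class names, layer dimensions, and Laplace noise scale `gamma` are illustrative assumptions, not the CSTE protocol itself (which additionally protects against collusion between the student node and the aggregator).

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_CLASSES = 10
INPUT_DIM = 32
CUT_DIM = 16  # width of the "cut layer" where each teacher's model is split


class TeacherClient:
    """Teacher-side (lightweight) half of a split model: input -> cut layer."""

    def __init__(self):
        self.w = rng.normal(size=(INPUT_DIM, CUT_DIM))

    def forward(self, x):
        # Only the cut-layer activations leave the teacher node.
        return np.maximum(x @ self.w, 0.0)


class SplitServer:
    """Server-side (compute-heavy) half of the split model: cut layer -> class scores."""

    def __init__(self):
        self.w = rng.normal(size=(CUT_DIM, NUM_CLASSES))

    def forward(self, h):
        return h @ self.w


def noisy_aggregate(votes, gamma=1.0):
    """PATE-style noisy argmax: add Laplace noise to the vote histogram, then pick the top class."""
    counts = np.bincount(votes, minlength=NUM_CLASSES).astype(float)
    counts += rng.laplace(0.0, gamma, size=NUM_CLASSES)
    return int(np.argmax(counts))


# One student query answered by an ensemble of split teachers.
teachers = [(TeacherClient(), SplitServer()) for _ in range(10)]
query = rng.normal(size=INPUT_DIM)
votes = np.array(
    [int(np.argmax(srv.forward(cli.forward(query)))) for cli, srv in teachers]
)
print("noisy aggregated label:", noisy_aggregate(votes, gamma=1.0))
```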

Bibliographic Details
Main Authors: Liu, Ziyao, Guo, Jiale, Yang, Mengmeng, Yang, Wenzhuo, Fan, Jiani, Lam, Kwok-Yan
Other Authors: School of Computer Science and Engineering
Format: Conference or Workshop Item
Language: English
Published: 2023
Subjects: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence; Privacy-Preservation; Knowledge Transfer
Online Access: https://hdl.handle.net/10356/172524
Institution: Nanyang Technological University
Record ID: sg-ntu-dr.10356-172524
Collection: DR-NTU (NTU Library)
Conference: 2023 Secure and Trustworthy Deep Learning Systems Workshop (SecTL'23)
Research Centre: Strategic Centre for Research in Privacy-Preserving Technologies & Systems
Type: Conference Paper (published version)
Citation: Liu, Z., Guo, J., Yang, M., Yang, W., Fan, J. & Lam, K. (2023). Privacy-enhanced knowledge transfer with collaborative split learning over teacher ensembles. 2023 Secure and Trustworthy Deep Learning Systems Workshop (SecTL'23). https://dx.doi.org/10.1145/3591197.3591303
ISBN: 9798400701818
DOI: 10.1145/3591197.3591303
Scopus ID: 2-s2.0-85168555484
Handle: https://hdl.handle.net/10356/172524
Deposited: 2023-12-13
Funding: National Research Foundation (NRF). This research is supported by the National Research Foundation, Singapore under its Strategic Capability Research Centres Funding Initiative.
Rights: © 2023 Copyright held by the owner/author(s). This work is licensed under a Creative Commons Attribution International 4.0 License.
File format: application/pdf