Privacy-preserving federating learning with differential privacy

Federated Learning represents a cutting-edge AI approach, facilitating collaborative model training across distributed devices, with applications spanning various sectors. For instance, in healthcare, institutions collaborate to predict diseases while ensuring patient data remains decentralized. Similarly, smartphones and IoT devices collectively enhance services such as predictive text and voice recognition while preserving sensitive information. Even within the federated learning framework, however, data privacy remains vulnerable to attacks such as model inversion and membership inference. Differential Privacy, a rigorous mathematical privacy technique, therefore becomes imperative for establishing a robust and secure federated learning environment. This paper focuses on studying the effectiveness of implementing differential privacy in federated learning to safeguard data privacy.
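The abstract describes the usual recipe in this area: each client trains locally and shares only a model update, and differential privacy is enforced by clipping and perturbing those updates before the server aggregates them. The short Python sketch below illustrates that idea only; it is not code from the project itself, and the function names, clipping norm, and noise multiplier are illustrative assumptions.

import numpy as np

def privatize_update(update, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    # Clip the client's update to an L2 norm of at most clip_norm, then add
    # Gaussian noise scaled to that norm (the Gaussian mechanism).
    rng = rng if rng is not None else np.random.default_rng()
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return clipped + noise

def federated_average(client_updates):
    # Server-side step of federated averaging over already-privatized updates.
    return np.mean(client_updates, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    raw_updates = [rng.normal(size=10) for _ in range(3)]  # simulated local updates
    private_updates = [privatize_update(u, rng=rng) for u in raw_updates]
    print(federated_average(private_updates))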


Bibliographic Details
Main Author: Qi, Kehu
Other Authors: Zhang Tianwei
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2024
Subjects: Computer and Information Science
Online Access: https://hdl.handle.net/10356/175164
Institution: Nanyang Technological University
Record ID: sg-ntu-dr.10356-175164
Record last updated: 2024-04-26
School: School of Computer Science and Engineering
Contact: tianwei.zhang@ntu.edu.sg
Subject: Computer and Information Science
Degree: Bachelor's degree
Deposited: 2024-04-22
File format: application/pdf
Citation: Qi, K. (2024). Privacy-preserving federating learning with differential privacy. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/175164
Holding institution: NTU Library, Nanyang Technological University (Singapore)
Collection: DR-NTU