Privacy-preserving federated learning with differential privacy


Bibliographic Details
Main Author: Qi, Kehu
Other Authors: Zhang, Tianwei
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2024
Subjects:
Online Access:https://hdl.handle.net/10356/175164
Institution: Nanyang Technological University
Description
Summary: Federated Learning represents a cutting-edge AI approach, facilitating collaborative model training across distributed devices, with applications spanning various sectors. For instance, in healthcare, institutions collaborate to predict diseases while ensuring patient data remains decentralized. Similarly, smartphones and IoT devices collectively enhance services such as predictive text and voice recognition while preserving sensitive information. Despite the decentralized design of the federated learning framework, data privacy remains vulnerable to attacks such as model inversion or membership inference. Thus, Differential Privacy, a rigorous privacy-preserving technique, becomes imperative for establishing a robust and secure federated learning environment. This paper focuses on studying the effectiveness of implementing differential privacy in federated learning to safeguard data privacy.
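The record does not include implementation details, but the summary alludes to the standard recipe for combining differential privacy with federated averaging: each client's update is clipped to bound its influence, and calibrated Gaussian noise is added before aggregation. The following minimal sketch illustrates that idea; the function names (clip_update, dp_federated_average) and parameter values are hypothetical and are not taken from the project.

import numpy as np

def clip_update(update, clip_norm):
    """Scale a client's update so its L2 norm is at most clip_norm."""
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / (norm + 1e-12))

def dp_federated_average(client_updates, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """One server-side aggregation round using the Gaussian mechanism.

    Clipping bounds each client's contribution (the sensitivity of the sum),
    and Gaussian noise with standard deviation noise_multiplier * clip_norm
    is added to the summed updates before averaging. Values are illustrative.
    """
    rng = rng or np.random.default_rng()
    clipped = [clip_update(u, clip_norm) for u in client_updates]
    summed = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=summed.shape)
    return (summed + noise) / len(client_updates)

# Toy usage: three clients each report a 4-dimensional model update.
rng = np.random.default_rng(1)
client_updates = list(rng.normal(size=(3, 4)))
print(dp_federated_average(client_updates, rng=rng))

Larger noise multipliers and smaller clipping norms strengthen the privacy guarantee at the cost of model accuracy; studying this trade-off is the kind of effectiveness question the abstract describes.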