Privacy-preserving deep learning
Main Authors: Yik, Jia Ler; Tan, Zhen Yuan; Zaw, Maw Htun
Other Authors: Lam Kwok Yan
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2021
Subjects: Engineering::Computer science and engineering
Online Access: https://hdl.handle.net/10356/151686
Institution: Nanyang Technological University
Degrees: Bachelor of Engineering Science (Computer Science); Bachelor of Engineering Science (Materials Engineering); Bachelor of Engineering Science (Mechanical Engineering)
Citation: Yik, J. L., Tan, Z. Y. & Zaw, M. H. (2021). Privacy-preserving deep learning. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/151686
Description:
Data has been called the new oil, reflecting growing awareness of its value across a myriad of applications, from automated personalised services to artificial intelligence, all with machine learning (ML) at their core.
Alongside this trend, consumers and government bodies are paying growing attention to privacy. This motivates Federated Learning (FL) and Differential Privacy (DP), approaches in which models are trained while the privacy of the underlying data is safeguarded, and these form the focus of our research.
We reviewed existing research on privacy-preserving deep learning for structured and unstructured data and designed a proof-of-concept platform for the same: a Convolutional Neural Network for the MNIST handwritten-digit dataset, hosted on the cloud. Our experiments tested the permutations between the degree of training, determined by the number of epochs per generation, and whether DP was applied.
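As an illustration of this kind of setup, the sketch below mimics one federated "generation": each client trains locally for a fixed number of epochs, optionally perturbs its weights with Gaussian noise before sharing them, and the server averages the results, FedAvg-style. All names and parameters here (local_train, perturb, dp_sigma, the toy data) are illustrative assumptions, not the code or hyperparameters from the report.

```python
# Illustrative sketch only: a FedAvg-style generation with optional weight
# perturbation. The "model" and client data are stand-ins for the report's
# actual CNN-on-MNIST implementation.
import numpy as np

rng = np.random.default_rng(0)

def local_train(weights, data, epochs):
    """Placeholder for local training: nudge weights toward the client's
    data mean for `epochs` steps (stands in for SGD on a CNN)."""
    w = weights.copy()
    for _ in range(epochs):
        w += 0.1 * (data.mean(axis=0) - w)
    return w

def perturb(weights, sigma):
    """DP-style step: add Gaussian noise to the trained weights before
    they leave the client."""
    return weights + rng.normal(0.0, sigma, size=weights.shape)

def federated_generation(global_w, client_data, epochs, dp_sigma=None):
    """One generation: every client trains locally, optionally adds noise,
    and the server averages the returned weights."""
    updates = []
    for data in client_data:
        w = local_train(global_w, data, epochs)
        if dp_sigma is not None:
            w = perturb(w, dp_sigma)
        updates.append(w)
    return np.mean(updates, axis=0)

# Toy run: 3 clients, comparing a non-DP and a DP configuration.
clients = [rng.normal(loc=i, size=(100, 4)) for i in range(3)]
w, w_dp = np.zeros(4), np.zeros(4)
for _ in range(5):  # 5 generations each
    w = federated_generation(w, clients, epochs=2)
    w_dp = federated_generation(w_dp, clients, epochs=2, dp_sigma=0.5)
print("final weights (no DP):", w)
print("final weights (DP):   ", w_dp)
```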
In particular, our findings indicated the following: (1) adding noise to trained weights resulted in an overall decrease in trained accuracy but a greater epsilon value; (2) a higher locally trained accuracy from a longer epoch run came with a larger accuracy drop; (3) DP models achieved lower final validation accuracy; (4) final validation accuracy showed low correlation with its standard deviation, whether or not DP was used.
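For context on finding (1), the textbook Gaussian mechanism ties the noise scale to the privacy budget: noise with standard deviation sigma = sensitivity * sqrt(2 ln(1.25/delta)) / epsilon satisfies (epsilon, delta)-DP for epsilon < 1, so weaker noise corresponds to a larger epsilon, i.e. weaker privacy. The report's exact privacy accounting is not stated here; this snippet only illustrates that standard relationship, with assumed epsilon and delta values.

```python
# Classic Gaussian-mechanism relationship between noise scale and epsilon
# (Dwork & Roth), shown purely for illustration; not necessarily the
# accounting method used in the report.
import math

def gaussian_sigma(epsilon, delta, sensitivity=1.0):
    """Noise std dev giving (epsilon, delta)-DP under the classic
    Gaussian mechanism (valid for epsilon < 1)."""
    return sensitivity * math.sqrt(2 * math.log(1.25 / delta)) / epsilon

for eps in (0.1, 0.5, 0.9):
    print(f"epsilon={eps:.1f} -> sigma={gaussian_sigma(eps, delta=1e-5):.2f}")
```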
Further research can be conducted on differing FL structures and degrees of centralisation. Although FL is relatively new, there is strong evidence of growing interest in it, and we believe FL has a place in collaborative ML-based applications that preserve the privacy of end users.