Privacy-preserving decentralized detection in sensor networks
With a burgeoning number of Internet of Things (IoT) devices penetrating all aspects of our lives, privacy-related issues are attracting increasing interest. Sensor networks, one of the most important enabling technologies of the IoT, should be designed with privacy in mind. By giving users mor...
Saved in:
Main Author: | Sun, Meng |
---|---|
Other Authors: | Tay Wee Peng |
Format: | Theses and Dissertations |
Language: | English |
Published: | 2019 |
Subjects: | DRNTU::Engineering::Electrical and electronic engineering::Wireless communication systems |
Online Access: | https://hdl.handle.net/10356/105947 http://hdl.handle.net/10220/48831 |
Institution: | Nanyang Technological University |
id | sg-ntu-dr.10356-105947 |
---|---|
record_format | dspace |
institution | Nanyang Technological University |
building | NTU Library |
continent | Asia |
country | Singapore |
content_provider | NTU Library |
collection | DR-NTU |
language | English |
topic | DRNTU::Engineering::Electrical and electronic engineering::Wireless communication systems |
spellingShingle | DRNTU::Engineering::Electrical and electronic engineering::Wireless communication systems Sun, Meng Privacy-preserving decentralized detection in sensor networks |
description |
With a burgeoning number of Internet of Things (IoT) devices penetrating all aspects of our lives, privacy-related issues are attracting increasing interest. Sensor networks, one of the most important enabling technologies of the IoT, should be designed with privacy in mind. By giving users more control over what information is shared from the sensors with service providers, we can encourage the adoption of IoT technologies and put users at ease.
We model the sensor network with a decentralized detection framework, in which each sensor makes a local decision based on its observation of the hypotheses and transmits that decision to a fusion center. The fusion center makes inferences based on the received sensor decisions. In this thesis, we aim to find a privacy mapping at each sensor that distorts the sensor observation before it is sent to the fusion center, so that privacy is protected while the fusion center can still detect the public hypothesis accurately.
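To make this setup concrete, the following minimal Python sketch (not taken from the thesis) shows decentralized detection with a per-sensor privacy mapping. The binary alphabets, the noise level `p_flip`, the flip probability `q` used as the privacy mapping, and the majority-vote fusion rule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: a binary public hypothesis H observed by K sensors
# through independent noisy channels (alphabets and noise level are assumptions).
K = 10
p_flip = 0.2                 # sensor observation noise
H = int(rng.integers(0, 2))  # true public hypothesis

def observe(h):
    """Each sensor sees h corrupted by independent bit-flip noise."""
    return h ^ int(rng.random() < p_flip)

def privacy_mapping(x, q):
    """Randomized privacy mapping: flip the local decision with probability q.
    The thesis designs these mappings per sensor; q = 0.1 is a placeholder."""
    return x ^ int(rng.random() < q)

def fusion_center(messages):
    """Majority-vote fusion rule on the received (distorted) decisions."""
    return int(sum(messages) > len(messages) / 2)

messages = [privacy_mapping(observe(H), q=0.1) for _ in range(K)]
print(f"true H = {H}, fusion center decision = {fusion_center(messages)}")
```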
Firstly, we consider protecting the information privacy of a private hypothesis without assuming knowledge of the joint distribution of the sensor observations and hypotheses. In a sensor network, multiple sensors send information to a fusion center so that it can infer a public hypothesis of interest. However, the same sensor information may be used by the fusion center to make inferences of a private nature that the sensors wish to protect. Without knowledge of the joint distribution of the sensor observations and hypotheses, we adopt a nonparametric learning approach to design the local privacy mappings. We introduce the concept of an empirical normalized risk, which provides a theoretical guarantee that the network achieves information privacy for the private hypothesis with high probability when the number of training samples is large. We develop iterative optimization algorithms to determine an appropriate privacy threshold and the best sensor privacy mappings, and we show that they converge. Finally, we extend our approach to the case of multiple private hypotheses.
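As a rough stand-in for this learning-based privacy check (my own construction, not the thesis's definition of the empirical normalized risk), one can estimate from training samples the best error an adversary could achieve when guessing the private label from the distorted messages, and accept a candidate mapping only if that error stays above a privacy threshold:

```python
import numpy as np

def empirical_private_risk(messages, private_labels):
    """Stand-in empirical risk on the private hypothesis: the smallest error
    rate achievable when guessing the binary private label from the distorted
    messages, estimated from empirical frequencies over the training set."""
    messages = np.asarray(messages)          # shape (n_samples, n_sensors)
    private_labels = np.asarray(private_labels)
    n = len(private_labels)
    risk = 0.0
    for pattern in np.unique(messages, axis=0):
        idx = np.all(messages == pattern, axis=1)
        counts = np.bincount(private_labels[idx], minlength=2)
        risk += counts.min() / n             # adversary guesses the majority label
    return risk

def is_private_enough(messages, private_labels, threshold=0.4):
    """Accept a candidate privacy mapping only if the empirical risk of the
    private hypothesis stays above the (assumed) privacy threshold."""
    return empirical_private_risk(messages, private_labels) >= threshold
```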
Secondly, we consider protecting the information privacy of a set of private hypotheses when the joint distribution of the sensor observations and hypotheses is known. We take into account the fact that privacy concerns are usually not limited to a single hypothesis: small deviations from a nominal private hypothesis should also be kept private from the fusion center. We identify a representative private hypothesis, which is the easiest to detect among the set of private hypotheses, and propose an algorithm that, by protecting the information privacy of this representative private hypothesis, protects the information privacy of the whole set of private hypotheses. We consider both the case where the number of sensors is finite and the case where it is infinite.
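One way to picture the "easiest to detect" criterion is sketched below; using KL divergence from the nominal message distribution as the detectability proxy is an assumption on my part, and the thesis may use a different measure.

```python
import numpy as np

def kl_divergence(p, q):
    """KL divergence D(p || q) between two discrete distributions (no zero entries)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

def representative_private_hypothesis(private_pmfs, nominal_pmf):
    """Pick the private hypothesis that is easiest to detect, approximated here
    by the largest divergence from the nominal message distribution."""
    return int(np.argmax([kl_divergence(p, nominal_pmf) for p in private_pmfs]))

# Example with three candidate private message distributions (illustrative numbers).
nominal = [0.25, 0.25, 0.25, 0.25]
candidates = [[0.40, 0.30, 0.20, 0.10],
              [0.30, 0.30, 0.20, 0.20],
              [0.70, 0.10, 0.10, 0.10]]
print(representative_private_hypothesis(candidates, nominal))  # -> 2
```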
Finally, we discuss the relationship between various privacy metrics proposed in the literature. We divide the privacy metrics into inference privacy and data privacy metrics. Here, data privacy refers to concealing the sensors' raw observations from the fusion center, while inference privacy refers to limiting the disclosure of the private states of the observed object to the fusion center. We show that inference privacy and data privacy are in general not equivalent. We propose methods to protect both inference and data privacy in decentralized detection by incorporating local differential privacy (data privacy) and information privacy (inference privacy) metrics. We consider both the case where the distribution of the sensor observations is known a priori and the case where it is not.
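A concrete way to see the data-privacy side is the standard randomized-response mechanism for local differential privacy, sketched below; the information-privacy (inference) constraint studied in the thesis would be imposed on top of this and is not shown. The value of `eps` is an arbitrary choice for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def randomized_response(bit, eps):
    """epsilon-local differential privacy for a binary local decision:
    report the true bit with probability e^eps / (e^eps + 1), else flip it."""
    keep_prob = np.exp(eps) / (np.exp(eps) + 1.0)
    return bit if rng.random() < keep_prob else 1 - bit

# Each sensor sanitizes its local decision before sending it to the fusion center.
local_decisions = [0, 1, 1, 0, 1]
reports = [randomized_response(b, eps=1.0) for b in local_decisions]
print(reports)
```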
author2 | Tay Wee Peng |
---|---|
author_facet | Tay Wee Peng Sun, Meng |
format | Theses and Dissertations |
author | Sun, Meng |
author_sort | Sun, Meng |
title | Privacy-preserving decentralized detection in sensor networks |
title_short | Privacy-preserving decentralized detection in sensor networks |
title_full | Privacy-preserving decentralized detection in sensor networks |
title_fullStr | Privacy-preserving decentralized detection in sensor networks |
title_full_unstemmed | Privacy-preserving decentralized detection in sensor networks |
title_sort | privacy-preserving decentralized detection in sensor networks |
publishDate | 2019 |
url | https://hdl.handle.net/10356/105947 http://hdl.handle.net/10220/48831 |
_version_ | 1772827554093203456 |
spelling | sg-ntu-dr.10356-105947 2023-07-04T16:31:12Z Privacy-preserving decentralized detection in sensor networks Sun, Meng Tay Wee Peng School of Electrical and Electronic Engineering Information Communication Institute of Singapore DRNTU::Engineering::Electrical and electronic engineering::Wireless communication systems Doctor of Philosophy 2019-06-19T05:33:28Z 2019-12-06T22:01:19Z 2019 Thesis Sun, M. (2019). Privacy-preserving decentralized detection in sensor networks. Doctoral thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/105947 http://hdl.handle.net/10220/48831 10.32657/10220/48831 en 153 p. application/pdf |