Privacy-preserving decentralized detection in sensor networks
Format: Theses and Dissertations
Language: English
Published: 2019
Online Access: https://hdl.handle.net/10356/105947 , http://hdl.handle.net/10220/48831
Institution: Nanyang Technological University
Summary: With a burgeoning number of Internet of Things (IoT) devices penetrating all aspects of our lives, privacy-related issues are attracting increasing interest. Sensor networks, one of the most important enabling technologies of IoT, should be designed with privacy considerations. By giving users more control over what information can be shared from the sensors to the service providers, we encourage the adoption of IoT technologies and put users at ease.
We model the sensor network with a decentralized detection framework, where each sensor makes a local decision based on its observation of the hypotheses and transmits that decision to a fusion center. The fusion center makes inferences based on the received sensor decisions. In this thesis, we aim to find a privacy mapping at each sensor that distorts sensor observations before they are sent to the fusion center, such that privacy is protected while the fusion center can still make an accurate detection of the public hypothesis.
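To make the setup concrete, here is a minimal sketch of the pipeline described above, assuming binary hypotheses, threshold quantizers, and a fixed bit-flipping privacy mapping. All of these choices, and every name in the code, are illustrative stand-ins; the thesis optimizes the privacy mappings rather than fixing them.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_decision(observation, threshold=0.5):
    # Each sensor quantizes its noisy observation into a binary local decision.
    return int(observation > threshold)

def privacy_mapping(decision, flip_prob=0.2):
    # Illustrative privacy mapping: randomly flip the local decision so the
    # fusion center sees a distorted report.  A fixed flip probability is an
    # assumption here; the mappings in the thesis are optimized.
    return decision ^ int(rng.random() < flip_prob)

def fusion_center(reports):
    # Majority vote over the distorted reports to detect the public
    # hypothesis H in {0, 1}.
    return int(sum(reports) > len(reports) / 2)

# Simulate: the public hypothesis H = 1 shifts the mean of each observation.
H = 1
observations = rng.normal(loc=H, scale=1.0, size=10)
reports = [privacy_mapping(local_decision(x)) for x in observations]
print("fusion center decision:", fusion_center(reports))
```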
Firstly, we consider protecting the information privacy of a private hypothesis without assuming knowledge of the joint distribution of the sensor observations and hypotheses. In a sensor network, multiple sensors send information to a fusion center for it to infer a public hypothesis of interest. However, the same sensor information may be used by the fusion center to make inferences of a private nature that the sensors wish to protect. Without assuming knowledge of the joint distribution of the sensor observations and hypotheses, we adopt a nonparametric learning approach to design local privacy mappings. We introduce the concept of an empirical normalized risk, which provides a theoretical guarantee for the network to achieve information privacy for the private hypothesis with high probability when the number of training samples is large. We develop iterative optimization algorithms to determine an appropriate privacy threshold and the best sensor privacy mappings, and show that they converge. We also extend our approach to the case of a private multiple hypothesis.
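For reference, the information privacy metric in this line of work is commonly defined (following Calmon and Fawaz) as a multiplicative bound between the prior and posterior of the private hypothesis. The notation below, with G for the private hypothesis and Z for the distorted sensor reports, is our own reconstruction from the literature, not a quotation from the thesis.

```latex
% epsilon-information privacy: the distorted reports Z leak little about
% the private hypothesis G if the posterior stays close to the prior.
\[
  e^{-\epsilon}
  \;\le\;
  \frac{\Pr(G = g \mid Z = z)}{\Pr(G = g)}
  \;\le\;
  e^{\epsilon}
  \quad \text{for all } g, z .
\]
% Smaller epsilon means observing Z can change the fusion center's belief
% about G by at most a factor of e^{epsilon}.
```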
Secondly, we consider protecting the information privacy of a set of private hypotheses when the joint distribution of the sensor observations and hypotheses is known. Privacy concerns are usually not limited to a single hypothesis: small deviations from a nominal private hypothesis should also be kept private from the fusion center. We find a representative private hypothesis, which is the easiest to detect among the set of private hypotheses, and propose an algorithm that protects its information privacy; we show that this in turn protects the information privacy of the entire set of private hypotheses. We consider the two cases where the number of sensors is finite and infinite, respectively.
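One plausible way to formalize "easiest to detect" is to score each private hypothesis by how far its observation distribution lies from a null distribution, for example in Kullback-Leibler divergence. The sketch below does exactly that; both the KL criterion and the toy distributions are assumptions for illustration, not the selection rule developed in the thesis.

```python
import numpy as np

def kl_divergence(p, q):
    # KL divergence D(p || q) between two discrete distributions,
    # assuming q > 0 wherever p > 0.
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def representative_hypothesis(private_dists, null_dist):
    # Pick the private hypothesis whose observation distribution is
    # farthest from the null: a proxy for "easiest to detect".
    scores = [kl_divergence(p, null_dist) for p in private_dists]
    return int(np.argmax(scores))

# Illustrative set of private hypotheses over a 3-valued observation.
private_dists = [[0.4, 0.4, 0.2], [0.6, 0.3, 0.1], [0.34, 0.33, 0.33]]
null_dist = [1 / 3, 1 / 3, 1 / 3]
print("representative index:", representative_hypothesis(private_dists, null_dist))
```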
Finally, we discuss the relationship between the various privacy metrics proposed in the literature. We divide the privacy metrics into inference privacy and data privacy metrics. Here, data privacy refers to the concealment of the sensors' raw observations from the fusion center, while inference privacy refers to reducing the disclosure of the private states of the object to the fusion center. We show that inference privacy and data privacy are in general not equivalent. We propose methods to protect both inference and data privacy in decentralized detection by incorporating local differential privacy (a data privacy metric) and information privacy (an inference privacy metric). We consider both cases, with and without prior knowledge of the sensor observations' distribution.
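As a concrete instance of the data privacy side, the sketch below applies the standard randomized response mechanism, which satisfies epsilon-local differential privacy for a binary local decision. This is the textbook construction, offered as an illustration rather than the specific scheme proposed in the thesis.

```python
import math
import random

def randomized_response(bit, epsilon):
    # Standard randomized response: report the true bit with probability
    # e^eps / (1 + e^eps), otherwise flip it.  This satisfies
    # epsilon-local differential privacy for a binary value.
    keep_prob = math.exp(epsilon) / (1 + math.exp(epsilon))
    return bit if random.random() < keep_prob else 1 - bit

# Each sensor perturbs its local decision before sending it to the
# fusion center; smaller epsilon means stronger (data) privacy.
local_decisions = [1, 0, 1, 1, 0]
reports = [randomized_response(b, epsilon=1.0) for b in local_decisions]
print(reports)
```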