Entropy as a measure of average loss of privacy
Main Authors: | , , |
---|---|
Format: | Journal |
Published: | 2018 |
Subjects: | |
Online Access: | https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85039744616&origin=inward http://cmuir.cmu.ac.th/jspui/handle/6653943832/57541 |
Institution: | Chiang Mai University |
Summary: | © 2017 by the Mathematical Association of Thailand. All rights reserved. Privacy means that not everything about a person is known, and that we need to ask additional questions to get the full information about the person. It therefore seems reasonable to gauge the degree of privacy in each situation by the average number of binary (“yes”-“no”) questions that we need to ask to determine the full information – which is exactly Shannon’s entropy. The problem with this idea is that it is possible, by asking two binary questions – and thus, strictly speaking, getting only two bits of information – to sometimes learn a large amount of information. In this paper, we show that while entropy is not always an adequate measure of the absolute loss of privacy, it is a good way to gauge the average loss of privacy. To properly evaluate different privacy-preserving schemes, we also propose to supplement the average privacy loss with the standard deviation of the privacy loss – to see how much the actual privacy loss can deviate from its average value. |
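As an illustrative sketch (an assumption, not taken from the paper itself): if the privacy loss from learning an outcome of probability p is measured by its surprisal −log₂ p, then Shannon’s entropy is the average of this loss over all outcomes, and the supplement proposed in the abstract is the standard deviation of the same quantity. A minimal Python sketch under this assumption, with a purely hypothetical distribution `probs`:

```python
import math

def surprisal(p):
    """Privacy loss (in bits) from learning an outcome that had probability p."""
    return -math.log2(p)

def entropy(probs):
    """Average privacy loss: Shannon entropy H = -sum(p * log2 p)."""
    return sum(p * surprisal(p) for p in probs if p > 0)

def privacy_loss_std(probs):
    """Standard deviation of the privacy loss -log2 p around its mean (the entropy)."""
    h = entropy(probs)
    variance = sum(p * (surprisal(p) - h) ** 2 for p in probs if p > 0)
    return math.sqrt(variance)

# Hypothetical distribution over four possible answers.
probs = [0.5, 0.25, 0.125, 0.125]
print(entropy(probs))           # 1.75 bits lost on average
print(privacy_loss_std(probs))  # spread of the actual loss around 1.75 bits
print(surprisal(0.125))         # 3.0 bits lost if the rarest outcome occurs
```

This toy distribution illustrates the abstract’s point: the average loss is 1.75 bits, yet a person whose answers correspond to the rarest outcome loses 3 bits, which is why the standard deviation is a useful supplement to the average.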