Privacy Loss in Distributed Constraint Reasoning: A Quantitative Framework for Analysis and Its Applications

Bibliographic Details
Main Authors: MAHESWARAN, Rajiv; Pearce, Jonathan; Bowring, Emma; Varakantham, Pradeep Reddy; Tambe, Milind
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2006
Subjects:
Online Access:https://ink.library.smu.edu.sg/sis_research/22
http://dx.doi.org/10.1007/s10458-006-5951-y
Institution: Singapore Management University
Language: English
id sg-smu-ink.sis_research-1021
record_format dspace
spelling sg-smu-ink.sis_research-1021 2010-09-22T14:00:36Z Privacy Loss in Distributed Constraint Reasoning: A Quantitative Framework for Analysis and Its Applications MAHESWARAN, Rajiv; Pearce, Jonathan; Bowring, Emma; Varakantham, Pradeep Reddy; Tambe, Milind 2006-01-01T08:00:00Z text https://ink.library.smu.edu.sg/sis_research/22 info:doi/10.1007/s10458-006-5951-y http://dx.doi.org/10.1007/s10458-006-5951-y Research Collection School Of Computing and Information Systems eng Institutional Knowledge at Singapore Management University Artificial Intelligence and Robotics; Business; Operations Research, Systems Engineering and Industrial Engineering
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic Artificial Intelligence and Robotics
Business
Operations Research, Systems Engineering and Industrial Engineering
description It is critical that agents deployed in real-world settings, such as businesses, offices, universities and research laboratories, protect their individual users’ privacy when interacting with other entities. Indeed, privacy is recognized as a key motivating factor in the design of several multiagent algorithms, such as in distributed constraint reasoning (including algorithms for both distributed constraint optimization (DCOP) and distributed constraint satisfaction problems (DisCSPs)), and researchers have begun to propose metrics for analyzing privacy loss in such multiagent algorithms. Unfortunately, a general quantitative framework to compare these existing metrics for privacy loss, or to identify dimensions along which to construct new metrics, is currently lacking. This paper presents three key contributions to address this shortcoming. First, the paper presents VPS (Valuations of Possible States), a general quantitative framework to express, analyze and compare existing metrics of privacy loss. Based on a state-space model, VPS is shown to capture various existing measures of privacy created for specific domains of DisCSPs. The utility of VPS is further illustrated through analysis of privacy loss in DCOP algorithms when such algorithms are used by personal assistant agents to schedule meetings among users. In addition, VPS helps identify dimensions along which to classify and construct new privacy metrics, and it also supports their quantitative comparison. Second, the article presents key inference rules that may be used in the analysis of privacy loss in DCOP algorithms under different assumptions. Third, detailed experiments based on the VPS-driven analysis lead to the following key results: (i) decentralization by itself does not provide superior privacy protection in DisCSP/DCOP algorithms when compared with centralization; instead, privacy protection also requires uncertainty about agents’ knowledge of the constraint graph; (ii) the metrics chosen to measure privacy loss must be examined carefully, since the qualitative properties of privacy loss, and hence the conclusions that can be drawn about an algorithm, can vary widely with the metric chosen. This paper should thus serve as a call to arms for further privacy research, particularly within the DisCSP/DCOP arena.
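As a rough illustration of the state-space idea the abstract describes (this is a minimal sketch with hypothetical names, not the paper's actual VPS formulation): an agent's secret is one of a set of possible states, an observer's view is the subset of states it cannot yet rule out, and a valuation over that subset measures remaining privacy. Privacy loss is then the drop in valuation after an inference shrinks the subset.

```python
# Illustrative VPS-style sketch (hypothetical names, not the paper's
# formulation): privacy is valued over the set of states an observer
# still considers possible; loss is the drop in that valuation.

def uniform_valuation(possible_states):
    """Value privacy as the number of states the observer cannot rule out."""
    return len(possible_states)

def privacy_loss(states_before, states_after, valuation=uniform_valuation):
    """Loss incurred when an observer's possible-state set shrinks."""
    return valuation(states_before) - valuation(states_after)

# Example: an agent's free meeting slot is one of 8 time slots; messages
# exchanged during negotiation let an observer rule out 5 of them.
all_slots = set(range(8))
after_inference = {0, 3, 6}
print(privacy_loss(all_slots, after_inference))  # 5
```

Other valuations (e.g. probabilistic or entropy-based ones) can be swapped in for `uniform_valuation`, which is one way to read the paper's point (ii) that conclusions about an algorithm can vary widely with the metric chosen.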
format text
author MAHESWARAN, Rajiv
Pearce, Jonathan
Bowring, Emma
Varakantham, Pradeep Reddy
Tambe, Milind
author_sort MAHESWARAN, Rajiv
title Privacy Loss in Distributed Constraint Reasoning: A Quantitative Framework for Analysis and Its Applications
title_sort privacy loss in distributed constraint reasoning: a quantitative framework for analysis and its applications
publisher Institutional Knowledge at Singapore Management University
publishDate 2006
url https://ink.library.smu.edu.sg/sis_research/22
http://dx.doi.org/10.1007/s10458-006-5951-y
_version_ 1770568851581304832