MDPs as distribution transformers: Affine invariant synthesis for safety objectives

Markov decision processes can be viewed as transformers of probability distributions. While this view is useful from a practical standpoint to reason about trajectories of distributions, basic reachability and safety problems are known to be computationally intractable (i.e., Skolem-hard) to solve in such models. Further, we show that even for simple examples of MDPs, strategies for safety objectives over distributions can require infinite memory and randomization. In light of this, we present a novel overapproximation approach to synthesize strategies in an MDP such that a safety objective over the distributions is met. More precisely, we develop a new framework for template-based synthesis of certificates as affine distributional and inductive invariants for safety objectives in MDPs. We provide two algorithms within this framework: one can only synthesize memoryless strategies but has relative completeness guarantees, while the other can synthesize general strategies. The runtime complexity of both algorithms is in PSPACE. We implement these algorithms and show that they can solve several non-trivial examples.
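
The following is a minimal, self-contained sketch (not the authors' implementation) of the distribution-transformer view and of what an affine distributional inductive invariant certifies. The toy 3-state MDP, the memoryless strategy, the safety threshold on the probability mass in state 2, and the sampling-based inductiveness check are all illustrative assumptions; the paper instead synthesizes such invariants symbolically from affine templates.

```python
import numpy as np

# Toy MDP with 3 states and 2 actions; P[a][s, t] = probability of moving
# from state s to state t when action a is played. State 2 is the "unsafe"
# absorbing state in this example.
P = [
    np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.0, 1.0]]),
    np.array([[0.5, 0.5, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]]),
]

# A memoryless randomized strategy: sigma[s, a] = probability of action a in state s.
sigma = np.array([[0.7, 0.3],
                  [0.0, 1.0],
                  [0.5, 0.5]])

def induced_matrix(P, sigma):
    """Stochastic matrix of the Markov chain induced by a memoryless strategy."""
    n = sigma.shape[0]
    M = np.zeros((n, n))
    for a, Pa in enumerate(P):
        M += sigma[:, [a]] * Pa          # row s scaled by sigma[s, a]
    return M

def step(mu, M):
    """One step of the distribution transformer: mu_{t+1} = mu_t @ M (row vector)."""
    return mu @ M

# Candidate affine distributional invariant  I = { mu : c @ mu <= d }.
# Safety objective (illustrative): keep at most 10% probability mass in state 2.
c, d = np.array([0.0, 0.0, 1.0]), 0.1

def inductive_on_samples(c, d, M, n_samples=10_000, tol=1e-9):
    """Sample-based sanity check of inductiveness: for random distributions mu
    inside the invariant, the successor mu @ M stays inside the invariant.
    This numeric check is only an illustration of the property being certified."""
    rng = np.random.default_rng(0)
    for _ in range(n_samples):
        mu = rng.dirichlet(np.ones(len(c)))            # random point on the simplex
        if c @ mu <= d and c @ step(mu, M) > d + tol:
            return False                               # counterexample to inductiveness
    return True

M = induced_matrix(P, sigma)
mu0 = np.array([1.0, 0.0, 0.0])                        # initial distribution
print("induced matrix is row-stochastic:", np.allclose(M.sum(axis=1), 1.0))
print("initial distribution satisfies invariant:", bool(c @ mu0 <= d))
print("invariant inductive on sampled distributions:", inductive_on_samples(c, d, M))
```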

Bibliographic Details
Main Authors: AKSHAY, S., CHATTERJEE, Krishnendu, MEGGENDORFER, Tobias, ZIKELIC, Dorde
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2023
Subjects: Markov decision processes; invariant synthesis; distribution transformers; Skolem hardness; Databases and Information Systems
Online Access: https://ink.library.smu.edu.sg/sis_research/9068
https://ink.library.smu.edu.sg/context/sis_research/article/10071/viewcontent/Computer_Aided_Verification.pdf
DOI: 10.1007/978-3-031-37709-9_5
License: http://creativecommons.org/licenses/by-nc-nd/4.0/
Collection: Research Collection School Of Computing and Information Systems
Institution: Singapore Management University