Maximum entropy as a feasible way to describe joint distribution in expert systems

© 2017 by the Mathematical Association of Thailand. All rights reserved. In expert systems, we elicit the probabilities of different statements from the experts. However, to adequately use the expert system, we also need to know the probabilities of different propositional combinations of the experts’ statements – i.e., we need to know the corresponding joint distribution. The problem is that there are exponentially many such combinations, and it is not practically possible to elicit all their probabilities from the experts. So, we need to estimate this joint distribution based on the available information. For this purpose, many practitioners use heuristic approaches – e.g., the t-norm approach of fuzzy logic. However, this is a particular case of a situation for which the maximum entropy approach has been invented, so why not use the maximum entropy approach? The problem is that in this case, the usual formulation of the maximum entropy approach requires maximizing a function with exponentially many unknowns – a task which is, in general, not practically feasible. In this paper, we show that in many reasonable examples, the corresponding maximum entropy problem can be reduced to an equivalent problem with a much smaller (and feasible) number of unknowns – a problem which is, therefore, much easier to solve.

Bibliographic Details
Main Authors: Thongchai Dumrongpokaphan, Vladik Kreinovich, Hung T. Nguyen
Format: Journal
Published: 2018
Subjects: Mathematics; Agricultural and Biological Sciences
Online Access:https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85039713278&origin=inward
http://cmuir.cmu.ac.th/jspui/handle/6653943832/43789
Institution: Chiang Mai University
id th-cmuir.6653943832-43789
record_format dspace
spelling th-cmuir.6653943832-437892018-04-25T07:32:36Z Maximum entropy as a feasible way to describe joint distribution in expert systems Thongchai Dumrongpokaphan Vladik Kreinovich Hung T. Nguyen Mathematics Agricultural and Biological Sciences © 2017 by the Mathematical Association of Thailand. All rights reserved. In expert systems, we elicit the probabilities of different statements from the experts. However, to adequately use the expert system, we also need to know the probabilities of different propositional combinations of the experts’ statements – i.e., we need to know the corresponding joint distribution. The problem is that there are exponentially many such combinations, and it is not practically possible to elicit all their probabilities from the experts. So, we need to estimate this joint distribution based on the available information. For this purpose, many practitioners use heuristic approaches – e.g., the t-norm approach of fuzzy logic. However, this is a particular case of a situation for which the maximum entropy approach has been invented, so why not use the maximum entropy approach? The problem is that in this case, the usual formulation of the maximum entropy approach requires maximizing a function with exponentially many unknowns – a task which is, in general, not practically feasible. In this paper, we show that in many reasonable examples, the corresponding maximum entropy problem can be reduced to an equivalent problem with a much smaller (and feasible) number of unknowns – a problem which is, therefore, much easier to solve. 2018-01-24T03:58:28Z 2018-01-24T03:58:28Z 2017-01-01 Journal 16860209 2-s2.0-85039713278 https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85039713278&origin=inward http://cmuir.cmu.ac.th/jspui/handle/6653943832/43789
institution Chiang Mai University
building Chiang Mai University Library
country Thailand
collection CMU Intellectual Repository
topic Mathematics
Agricultural and Biological Sciences
spellingShingle Mathematics
Agricultural and Biological Sciences
Thongchai Dumrongpokaphan
Vladik Kreinovich
Hung T. Nguyen
Maximum entropy as a feasible way to describe joint distribution in expert systems
description © 2017 by the Mathematical Association of Thailand. All rights reserved. In expert systems, we elicit the probabilities of different statements from the experts. However, to adequately use the expert system, we also need to know the probabilities of different propositional combinations of the experts’ statements – i.e., we need to know the corresponding joint distribution. The problem is that there are exponentially many such combinations, and it is not practically possible to elicit all their probabilities from the experts. So, we need to estimate this joint distribution based on the available information. For this purpose, many practitioners use heuristic approaches – e.g., the t-norm approach of fuzzy logic. However, this is a particular case of a situation for which the maximum entropy approach has been invented, so why not use the maximum entropy approach? The problem is that in this case, the usual formulation of the maximum entropy approach requires maximizing a function with exponentially many unknowns – a task which is, in general, not practically feasible. In this paper, we show that in many reasonable examples, the corresponding maximum entropy problem can be reduced to an equivalent problem with a much smaller (and feasible) number of unknowns – a problem which is, therefore, much easier to solve.
format Journal
author Thongchai Dumrongpokaphan
Vladik Kreinovich
Hung T. Nguyen
author_facet Thongchai Dumrongpokaphan
Vladik Kreinovich
Hung T. Nguyen
author_sort Thongchai Dumrongpokaphan
title Maximum entropy as a feasible way to describe joint distribution in expert systems
title_short Maximum entropy as a feasible way to describe joint distribution in expert systems
title_full Maximum entropy as a feasible way to describe joint distribution in expert systems
title_fullStr Maximum entropy as a feasible way to describe joint distribution in expert systems
title_full_unstemmed Maximum entropy as a feasible way to describe joint distribution in expert systems
title_sort maximum entropy as a feasible way to describe joint distribution in expert systems
publishDate 2018
url https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85039713278&origin=inward
http://cmuir.cmu.ac.th/jspui/handle/6653943832/43789
_version_ 1681422438576947200
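
The abstract points out that the joint distribution over n expert statements has exponentially many (2^n) unknowns, while the elicited information is much smaller. A minimal sketch of the underlying idea, under the simplifying assumption that only the n marginal probabilities P(A_i) are elicited (an illustrative special case with hypothetical numbers, not the authors' general reduction): with marginal constraints alone, the maximum-entropy joint distribution is the independent product, so every joint probability can be computed from just the n elicited numbers. The snippet below checks this numerically for three statements by comparing entropies.

```python
import itertools
import math

# Illustrative sketch (hypothetical numbers, not taken from the paper):
# three expert statements A1, A2, A3 with elicited marginal probabilities.
marginals = [0.8, 0.6, 0.3]
n = len(marginals)

def entropy(dist):
    """Shannon entropy (in nats) of a distribution given as {outcome: probability}."""
    return -sum(p * math.log(p) for p in dist.values() if p > 0)

def product_joint(ps):
    """Independent-product joint distribution over all 2**n truth assignments.

    With only the n marginal constraints P(A_i) = p_i, this product is the
    maximum-entropy joint distribution, so the 2**n joint probabilities are
    determined by just the n elicited numbers.
    """
    joint = {}
    for bits in itertools.product([0, 1], repeat=len(ps)):
        prob = 1.0
        for b, p in zip(bits, ps):
            prob *= p if b else (1.0 - p)
        joint[bits] = prob
    return joint

maxent = product_joint(marginals)

# Any other joint distribution with the same marginals has lower entropy.
# Example: shift mass between the (A1, A2) outcomes (1,1)/(0,0) and (1,0)/(0,1);
# this keeps all three marginal probabilities unchanged.
eps = 0.05
perturbed = dict(maxent)
for x3 in (0, 1):
    w = marginals[2] if x3 else 1.0 - marginals[2]
    perturbed[(1, 1, x3)] += eps * w
    perturbed[(0, 0, x3)] += eps * w
    perturbed[(1, 0, x3)] -= eps * w
    perturbed[(0, 1, x3)] -= eps * w

print("entropy of product (max-entropy) joint:", round(entropy(maxent), 4))
print("entropy of perturbed joint, same marginals:", round(entropy(perturbed), 4))
print("P(A1 & A2 & A3) under max entropy:", round(maxent[(1, 1, 1)], 4))
```

Running the sketch shows the perturbed distribution, although consistent with the same elicited marginals, has strictly lower entropy than the product distribution, which is the sense in which the maximum-entropy estimate needs only a small number of unknowns in this special case.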