Journal Article
Journal: International Journal of Approximate Reasoning, vol. 184, article 109454
Keywords: Belief functions, Entropy, Mutual information, Divergence
Preprint: https://library.utia.cas.cz/separaty/2025/MTR/kratochvil-0635531-preprint.pdf
Publisher: https://www.sciencedirect.com/science/article/pii/S0888613X25000957?via%3Dihub
Abstract (eng): This paper addresses the long-standing challenge of identifying belief function entropies that can effectively guide model learning within the Dempster-Shafer theory of evidence. Building on the analogy with classical probabilistic approaches, we examine 25 entropy functions documented in the literature and evaluate their potential to define mutual information in the belief function framework. As conceptualized in probability theory, mutual information requires strictly subadditive entropies, which are inversely related to the informativeness of belief functions. Our analysis shows that none of the studied entropy functions fully satisfies these criteria. Nevertheless, certain entropy functions exhibit properties that may make them useful for heuristic model learning algorithms. This paper provides a detailed comparative study of these functions, explores alternative approaches using divergence-based measures, and offers insights into the design of information-theoretic tools for belief function models.
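The probabilistic analogy invoked in the abstract can be made explicit. In Shannon's framework, mutual information is nonnegative precisely because joint entropy is subadditive; the notation below is the standard probabilistic one, not taken from the paper:

```latex
% Shannon mutual information of random variables X and Y:
I(X;Y) \;=\; H(X) + H(Y) - H(X,Y) \;\ge\; 0
\quad\Longleftrightarrow\quad
H(X,Y) \;\le\; H(X) + H(Y).
```

Strict subadditivity (strict inequality whenever $X$ and $Y$ are dependent) guarantees that $I(X;Y)=0$ if and only if $X$ and $Y$ are independent; a belief-function entropy would need the analogous property for the induced mutual information to be a sound model-learning criterion.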
: BA
: 10103