Towards an Axiomatization for the Generalization of the Kullback-Leibler Divergence to Belief Functions
- DOI
- 10.2991/eusflat.2011.28
- Keywords
- Dempster-Shafer theory of belief functions, channel capacity, Kullback-Leibler divergence
- Abstract
In his information theory, Shannon [1] defined a notion of uncertainty, the entropy, which has been generalized in several ways to belief functions [2]. He also defined the channel capacity, for which this paper proposes the first generalization to belief functions. To do so, we first need to generalize the Kullback-Leibler (KL) divergence, for which the present work proposes some axioms. The list of axioms is not yet exhaustive, since the proposed solution is not unique. The result nevertheless has many practical applications: the notion of channel capacity is useful, for example, to characterize and optimize systems of sensors, and its generalization to belief functions allows imprecise sensors, such as a human observer, to be included. Finally, we show an example of a gradient algorithm that computes the generalized channel capacity.
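As background to the abstract, the two classical quantities the paper generalizes, the KL divergence and Shannon's channel capacity, can be sketched numerically. The sketch below uses the standard Blahut-Arimoto alternating-optimization iteration for a discrete memoryless channel; it is not the paper's gradient algorithm, nor its belief-function generalization, and all function names are illustrative.

```python
import numpy as np

def kl_divergence(p, q):
    """Classical Kullback-Leibler divergence D(p || q), in bits."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def channel_capacity(W, iters=200):
    """Blahut-Arimoto iteration for the capacity (in bits) of a
    discrete memoryless channel given as a row-stochastic matrix
    W[x, y] = P(y | x)."""
    n_in = W.shape[0]
    p = np.full(n_in, 1.0 / n_in)  # start from the uniform input law
    for _ in range(iters):
        q = p @ W  # induced output distribution
        # D(W(.|x) || q) for each input symbol x
        d = np.array([kl_divergence(W[x], q) for x in range(n_in)])
        p = p * np.exp2(d)  # multiplicative update toward the optimum
        p /= p.sum()
    q = p @ W
    # Mutual information of the final input law = capacity estimate
    return float(sum(p[x] * kl_divergence(W[x], q) for x in range(n_in)))

# Binary symmetric channel with crossover 0.1:
# its capacity is 1 - H2(0.1) ~ 0.531 bits.
bsc = np.array([[0.9, 0.1], [0.1, 0.9]])
cap = channel_capacity(bsc)
```

For a symmetric channel the uniform input law is already optimal, so the iteration converges immediately; for general channels the multiplicative update increases the mutual information at each step.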
- Copyright
- © 2011, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY  - CONF
AU  - Hélène Soubaras
PY  - 2011/08
DA  - 2011/08
TI  - Towards an Axiomatization for the Generalization of the Kullback-Leibler Divergence to Belief Functions
BT  - Proceedings of the 7th conference of the European Society for Fuzzy Logic and Technology (EUSFLAT-11)
PB  - Atlantis Press
SP  - 1090
EP  - 1097
SN  - 1951-6851
UR  - https://doi.org/10.2991/eusflat.2011.28
DO  - 10.2991/eusflat.2011.28
ID  - Soubaras2011/08
ER  -