
Kybernetika 56 no. 6, 1133-1153, 2020

Tropical probability theory and an application to the entropic cone

Rostislav Matveev and Jacobus W. Portegies

DOI: 10.14736/kyb-2020-6-1133

Abstract:

In a series of articles, we have been developing a theory of \emph{tropical diagrams of probability spaces}, expecting it to be useful for information optimization problems in information theory and artificial intelligence. In this article, we give a summary of our work so far and apply the theory to derive a dimension-reduction statement about the shape of the entropic cone.
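
For readers unfamiliar with the entropic cone, the standard setting (as in Yeung [23]) can be sketched as follows; the notation $\Gamma_n^*$ below is the common one in the literature, not notation fixed by this abstract. Given jointly distributed random variables $X_1, \dots, X_n$ with finite ranges, write $X_S = (X_i)_{i \in S}$ for nonempty $S \subseteq \{1, \dots, n\}$ and collect the joint entropies into a single vector:

\[
  h = \bigl( H(X_S) \bigr)_{\emptyset \neq S \subseteq \{1,\dots,n\}} \in \mathbb{R}^{2^n - 1}.
\]

The set $\Gamma_n^*$ of all vectors realizable this way is the entropic region, and its closure $\overline{\Gamma_n^*}$ is a closed convex cone, the entropic cone. For $n \le 3$ this cone is cut out exactly by the Shannon inequalities (non-negativity of conditional entropies and conditional mutual informations), but for $n \ge 4$ it is strictly smaller, as first shown by the non-Shannon inequality of Zhang and Yeung [25]; its exact shape for $n \ge 4$ is unknown, which is the context for the dimension-reduction statement above.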

Keywords:

tropical probability, entropic cone, non-Shannon inequality

Classification:

94A17, 94A24

References:

  1. R. Ahlswede and J. Körner: On common information and related characteristics of correlated information sources. Preprint, 7th Prague Conference on Information Theory, 1974.
  2. R. Ahlswede and J. Körner:
  3. N. Bertschinger, J. Rauh, E. Olbrich, J. Jost and N. Ay: Quantifying unique information. Entropy {\em 16} (2014), 4, 2161-2183.   DOI:10.3390/e16042161
  4. T. H. Chan and R. W. Yeung: On a relation between information inequalities and group theory. IEEE Trans. Inform. Theory {\em 48} (2002), 7, 1992-1995.   DOI:10.1109/tit.2002.1013138
  5. R. Dougherty, Ch. Freiling and K. Zeger: Six new non-Shannon information inequalities. In: 2006 IEEE International Symposium on Information Theory, IEEE, 2006, pp. 233-236.   DOI:10.1109/isit.2006.261840
  6. R. Dougherty, Ch. Freiling and K. Zeger: Non-Shannon information inequalities in four random variables. arXiv preprint arXiv:1104.3602, 2011.
  7. M. Gromov: In a search for a structure, part 1: On entropy. Preprint.
  8. M. Kovačević, I. Stanojević and V. Šenk: On the hardness of entropy minimization and related problems. In: 2012 IEEE Information Theory Workshop, IEEE, 2012, pp. 512-516.
  9. T. Leinster: Basic Category Theory. Cambridge Studies in Advanced Mathematics 143, Cambridge University Press, Cambridge 2014.
  10. F. Matúš: Probabilistic conditional independence structures and matroid theory: background 1. Int. J. General System {\em 22} (1993), 2, 185-196.   DOI:10.1080/03081079308935205
  11. F. Matúš: Two constructions on limits of entropy functions. IEEE Trans. Inform. Theory {\em 53} (2006), 1, 320-330.   DOI:10.1109/tit.2006.887090
  12. F. Matúš: Infinitely many information inequalities. In: IEEE International Symposium on Information Theory, ISIT 2007, IEEE, pp. 41-44.   DOI:10.1109/isit.2007.4557201
  13. F. Matúš and L. Csirmaz: Entropy region and convolution. IEEE Trans. Inform. Theory {\em 62} (2016), 11, 6007-6018.   DOI:10.1109/tit.2016.2601598
  14. F. Matúš and M. Studený: Conditional independences among four random variables I. Combinat. Probab. Comput. {\em 4} (1995), 3, 269-278.   DOI:10.1017/s0963548300001644
  15. K. Makarychev, Y. Makarychev, A. Romashchenko and N. Vereshchagin: A new class of non-Shannon-type inequalities for entropies. Comm. Inform. Syst. {\em 2} (2002), 2, 147-166.   DOI:10.4310/cis.2002.v2.n2.a3
  16. R. Matveev and J. W. Portegies: Asymptotic dependency structure of multiple signals. Inform. Geometry {\em 1} (2018), 2, 237-285.   DOI:10.1007/s41884-018-0013-5
  17. R. Matveev and J. W. Portegies: Arrow contraction and expansion in tropical diagrams. arXiv preprint arXiv:1905.05597, 2019.
  18. R. Matveev and J. W. Portegies: Conditioning in tropical probability theory. arXiv preprint arXiv:1905.05596, 2019.
  19. R. Matveev and J. W. Portegies: Tropical diagrams of probability spaces. arXiv preprint arXiv:1905.04375, 2019.
  20. D. Slepian and J. Wolf: Noiseless coding of correlated information sources. IEEE Trans. Inform. Theory {\em 19} (1973), 4, 471-480.   DOI:10.1109/tit.1973.1055037
  21. M. Vidyasagar: A metric between probability distributions on finite sets of different cardinalities and applications to order reduction. IEEE Trans. Automat. Control {\em 57} (2012), 10, 2464-2477.   DOI:10.1109/tac.2012.2188423
  22. A. Wyner: The common information of two dependent random variables. IEEE Trans. Inform. Theory {\em 21} (1975), 2, 163-179.   DOI:10.1109/tit.1975.1055346
  23. R. W. Yeung: Information Theory and Network Coding. Springer Science and Business Media, 2008.   DOI:10.1007/978-0-387-79234-7
  24. Z. Zhang and R. W. Yeung: A non-Shannon-type conditional inequality of information quantities. IEEE Trans. Inform. Theory {\em 43} (1997), 6, 1982-1986.   DOI:10.1109/18.641561
  25. Z. Zhang and R. W. Yeung: On characterization of entropy function via information inequalities. IEEE Trans. Inform. Theory {\em 44} (1998), 4, 1440-1452.   DOI:10.1109/18.681320