Abstract
This work explores connections between core-concavity and gain functions, two alternative approaches that emerged in the quantitative information flow community as general frameworks for studying information leakage. In particular, (1) we revisit “Axioms for Information Leakage”, replacing averaging with \(\eta \)-averaging and convexity with core-concavity; an interesting consequence of these changes is that the revised axioms capture all Rényi entropies, including those not captured by the original formulation of the axioms. (2) We provide an alternative proof of the Coriaceous Theorem based on core-concavity. The approach of this work is more information-theoretic in nature than the work based on gain functions, and it provides an alternative foundational view of quantitative information flow, rooted in the essential properties of entropy as a measure of uncertainty.
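The abstract refers to the family of Rényi entropies, which generalize Shannon entropy via an order parameter \(\alpha\). As a minimal illustrative sketch (not taken from the chapter), the standard definitions can be computed directly; `renyi_entropy` is a hypothetical helper name chosen here:

```python
import math

def renyi_entropy(probs, alpha):
    """Rényi entropy H_alpha(p) in bits for a discrete distribution.

    alpha = 1 is taken as the Shannon limit; alpha = inf as min-entropy.
    """
    probs = [p for p in probs if p > 0]
    if alpha == 1:
        # Shannon entropy, the limit of H_alpha as alpha -> 1.
        return -sum(p * math.log2(p) for p in probs)
    if alpha == math.inf:
        # Min-entropy, the limit as alpha -> inf.
        return -math.log2(max(probs))
    return math.log2(sum(p ** alpha for p in probs)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
for a in (0.5, 1, 2, math.inf):
    print(a, renyi_entropy(p, a))
```

A quick sanity check: Rényi entropy is non-increasing in \(\alpha\), so for the distribution above one gets \(H_{0.5} \ge H_1 \ge H_2 \ge H_\infty\).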
Notes
- 1.
This is also known in the literature as “conditioning reduces entropy” (CRE).
- 2.
- 3.
Notice that, as F is continuous over a compact set (and thus, bounded), \(\lim _{\epsilon \rightarrow 0^+} G_F(\epsilon \mathbf {u})=0=G_F(0,\dots ,0)\), for all \(\mathbf {u}\in \mathbb {R}^n_{\ge 0}\). Hence, \(G_F\) is continuous.
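Note 1 above mentions the “conditioning reduces entropy” (CRE) property, \(H(X \mid Y) \le H(X)\). A minimal numerical sketch of this property for Shannon entropy on a toy joint distribution (illustrative only; the distribution is an assumption chosen for the example):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy joint distribution p(x, y) over X = {0, 1}, Y = {0, 1}.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginals p(x) and p(y).
px = [sum(v for (x, _), v in joint.items() if x == xv) for xv in (0, 1)]
py = [sum(v for (_, y), v in joint.items() if y == yv) for yv in (0, 1)]

# Conditional entropy H(X|Y) = sum_y p(y) * H(X | Y = y).
h_x = shannon_entropy(px)
h_x_given_y = sum(
    py[yv] * shannon_entropy([joint[(xv, yv)] / py[yv] for xv in (0, 1)])
    for yv in (0, 1)
)
print(h_x, h_x_given_y)  # conditioning does not increase entropy
```

Here \(H(X) = 1\) bit while \(H(X \mid Y) \approx 0.72\) bits, consistent with CRE.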
References
Alvim, M.S., Chatzikokolakis, K., McIver, A., Morgan, C., Palamidessi, C., Smith, G.: Axioms for information leakage. In: Proceedings of CSF, pp. 77–92 (2016). https://doi.org/10.1109/CSF.2016.13
Alvim, M.S., Chatzikokolakis, K., Palamidessi, C., Smith, G.: Measuring information leakage using generalized gain functions. In: 2012 IEEE 25th Computer Security Foundations Symposium, pp. 265–279 (2012). https://doi.org/10.1109/CSF.2012.26
Arimoto, S.: Information measures and capacity of order \(\alpha \) for discrete memoryless channels. Top. Inf. Theory (1977)
Boyd, S., Vandenberghe, L.: Convex Optimization. Cambridge University Press, Cambridge (2004)
Dahl, G.: Matrix majorization. Linear Algebra Appl. 288, 53–73 (1999). https://doi.org/10.1016/S0024-3795(98)10175-1. http://www.sciencedirect.com/science/article/pii/S0024379598101751
Havrda, J., Charvát, F.: Quantification method of classification processes: concept of structural \(\alpha \)-entropy. Kybernetika 3(1), 30–35 (1967)
Hayashi, M.: Exponential decreasing rate of leaked information in universal random privacy amplification. IEEE Trans. Inf. Theory 57(6), 3989–4001 (2011)
Iwamoto, M., Shikata, J.: Information theoretic security for encryption based on conditional Rényi entropies. In: Padró, C. (ed.) Information Theoretic Security, pp. 103–121. Springer, Cham (2014)
Khouzani, M.H.R., Malacaria, P.: Relative perfect secrecy: universally optimal strategies and channel design. In: IEEE 29th Computer Security Foundations Symposium, CSF 2016, Lisbon, Portugal, 27 June–1 July 2016, pp. 61–76 (2016). https://doi.org/10.1109/CSF.2016.12
Marshall, A.W., Olkin, I., Arnold, B.C.: Inequalities: Theory of Majorization and Its Applications. Mathematics in Science and Engineering, vol. 143. Academic Press, Cambridge (1979)
Massey, J.L.: Guessing and entropy. In: Proceedings of the IEEE International Symposium on Information Theory, p. 204. IEEE (1994)
McIver, A., Morgan, C., Smith, G., Espinoza, B., Meinicke, L.: Abstract channels and their robust information-leakage ordering. In: Abadi, M., Kremer, S. (eds.) POST 2014. LNCS, vol. 8414, pp. 83–102. Springer, Heidelberg (2014). https://doi.org/10.1007/978-3-642-54792-8_5
Rényi, A.: On measures of entropy and information. In: Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, pp. 547–561 (1961)
Shannon, C.E.: A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423, 623–656 (1948)
Sharma, B.D., Mittal, D.P.: New non-additive measures of entropy for discrete probability distributions. J. Math. Sci. (Soc. Math. Sci. Delhi India) 10, 28–40 (1975)
Smith, G.: On the foundations of quantitative information flow. In: de Alfaro, L. (ed.) FoSSaCS 2009. LNCS, vol. 5504, pp. 288–302. Springer, Heidelberg (2009). https://doi.org/10.1007/978-3-642-00596-1_21
Tsallis, C.: Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 52(1–2), 479–487 (1988)
Copyright information
© 2019 Springer Nature Switzerland AG
About this chapter
Cite this chapter
Américo, A., Khouzani, M.H.R., Malacaria, P. (2019). Core-concavity, Gain Functions and Axioms for Information Leakage. In: Alvim, M., Chatzikokolakis, K., Olarte, C., Valencia, F. (eds) The Art of Modelling Computational Systems: A Journey from Logic and Concurrency to Security and Privacy. Lecture Notes in Computer Science(), vol 11760. Springer, Cham. https://doi.org/10.1007/978-3-030-31175-9_15
DOI: https://doi.org/10.1007/978-3-030-31175-9_15
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-31174-2
Online ISBN: 978-3-030-31175-9
eBook Packages: Computer Science, Computer Science (R0)