Abstract
Many side-channel distinguishers (such as DPA/DoM, CPA, Euclidean Distance, KSA, MIA, etc.) have been devised and studied to extract keys from cryptographic devices. Each has its pros and cons and finds applications in various contexts. These distinguishers have been described theoretically in order to determine which distinguisher is best for a given context, enabling an unambiguous characterization in terms of success rate or number of traces required to extract the secret key.
In this paper, we show that in the case of monobit leakages, the theoretical expressions of all these distinguishers depend on only two parameters: the confusion coefficient and the signal-to-noise ratio. We provide closed-form expressions and leverage them to compare the distinguishers in terms of convergence speed for distinguishing between key candidates. This study contrasts with previous works, where only the asymptotic behavior was determined: when the number of traces tends to infinity, or when the signal-to-noise ratio tends to zero.
Notes
- 1.
- 2. In [10], CPA is treated as a distinguisher, but without the absolute values. Those remove false positives, which occur in monobit leakages when there are anti-correlations. Our value of the success exponent is therefore different from theirs.
References
Batina, L., Robshaw, M. (eds.): CHES 2014. LNCS, vol. 8731. Springer, Heidelberg (2014). https://doi.org/10.1007/978-3-662-44709-3
Blahut, R.E.: Principles and Practice of Information Theory. Addison-Wesley Longman Publishing Co. Inc., Boston (1987)
Brier, É., Clavier, C., Olivier, F.: Correlation power analysis with a leakage model. In: Joye, M., Quisquater, J.-J. (eds.) CHES 2004. LNCS, vol. 3156, pp. 16–29. Springer, Heidelberg (2004). https://doi.org/10.1007/978-3-540-28632-5_2
Carlet, C., Heuser, A., Picek, S.: Trade-offs for S-boxes: cryptographic properties and side-channel resilience. In: Gollmann, D., Miyaji, A., Kikuchi, H. (eds.) ACNS 2017. LNCS, vol. 10355, pp. 393–414. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-61204-1_20
Cover, T.M., Thomas, J.A.: Elements of Information Theory, 2nd edn. Wiley-Interscience, New York (2006). ISBN-10: 0471241954, ISBN-13: 978-0471241959
Daemen, J., Rijmen, V.: Rijndael for AES. In: AES Candidate Conference, pp. 343–348 (2000)
Fei, Y., Ding, A.A., Lao, J., Zhang, L.: A statistics-based success rate model for DPA and CPA. J. Cryptographic Eng. 5(4), 227–243 (2015). https://doi.org/10.1007/s13389-015-0107-0
Fei, Y., Luo, Q., Ding, A.A.: A statistical model for DPA with novel algorithmic confusion analysis. In: Prouff, E., Schaumont, P. (eds.) CHES 2012. LNCS, vol. 7428, pp. 233–250. Springer, Heidelberg (2012). https://doi.org/10.1007/978-3-642-33027-8_14
Gierlichs, B., Batina, L., Tuyls, P., Preneel, B.: Mutual information analysis. In: Oswald, E., Rohatgi, P. (eds.) CHES 2008. LNCS, vol. 5154, pp. 426–442. Springer, Heidelberg (2008). https://doi.org/10.1007/978-3-540-85053-3_27
Guilley, S., Heuser, A., Rioul, O.: A key to success. In: Biryukov, A., Goyal, V. (eds.) INDOCRYPT 2015. LNCS, vol. 9462, pp. 270–290. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-26617-6_15
Heuser, A., Rioul, O., Guilley, S.: A theoretical study of Kolmogorov-Smirnov distinguishers – side-channel analysis vs. differential cryptanalysis. In: Prouff [17], pp. 9–28. https://doi.org/10.1007/978-3-319-10175-0_2
Heuser, A., Rioul, O., Guilley, S.: Good is not good enough - deriving optimal distinguishers from communication theory. In: Batina and Robshaw [1], pp. 55–74. https://doi.org/10.1007/978-3-662-44709-3_4
Kocher, P., Jaffe, J., Jun, B.: Differential power analysis. In: Wiener, M. (ed.) CRYPTO 1999. LNCS, vol. 1666, pp. 388–397. Springer, Heidelberg (1999). https://doi.org/10.1007/3-540-48405-1_25
Lomné, V., Prouff, E., Rivain, M., Roche, T., Thillard, A.: How to estimate the success rate of higher-order side-channel attacks. In: Batina and Robshaw [1], pp. 35–54. https://doi.org/10.1007/978-3-662-44709-3_3
Mangard, S., Oswald, E., Popp, T.: Power Analysis Attacks. Revealing the Secrets of Smart Cards. Springer, Boston (2007). https://doi.org/10.1007/978-0-387-38162-6
Mangard, S., Oswald, E., Standaert, F.: One for all - all for one: unifying standard differential power analysis attacks. IET Inf. Secur. 5(2), 100–110 (2011). https://doi.org/10.1049/iet-ifs.2010.0096
Prouff, E. (ed.): COSADE 2014. LNCS, vol. 8622. Springer, Cham (2014). https://doi.org/10.1007/978-3-319-10175-0
Reparaz, O., Gierlichs, B., Verbauwhede, I.: A note on the use of margins to compare distinguishers. In: Prouff [17], pp. 1–8. https://doi.org/10.1007/978-3-319-10175-0_1
Rivain, M.: On the exact success rate of side channel analysis in the Gaussian model. In: Avanzi, R.M., Keliher, L., Sica, F. (eds.) SAC 2008. LNCS, vol. 5381, pp. 165–183. Springer, Heidelberg (2009). https://doi.org/10.1007/978-3-642-04159-4_11
Schindler, W., Lemke, K., Paar, C.: A stochastic model for differential side channel cryptanalysis. In: Rao, J.R., Sunar, B. (eds.) CHES 2005. LNCS, vol. 3659, pp. 30–46. Springer, Heidelberg (2005). https://doi.org/10.1007/11545262_3
Whitnall, C., Oswald, E.: A fair evaluation framework for comparing side-channel distinguishers. J. Cryptographic Eng. 1(2), 145–160 (2011)
Whitnall, C., Oswald, E., Mather, L.: An exploration of the Kolmogorov-Smirnov test as a competitor to mutual information analysis. In: Prouff, E. (ed.) CARDIS 2011. LNCS, vol. 7079, pp. 234–251. Springer, Heidelberg (2011). https://doi.org/10.1007/978-3-642-27257-8_15
Appendices
A Proof of Lemma 3
The MIA distinguisher is expressed as
From Sect. 3.1, \(Y(k^*)\) knowing Y(k) is a binary random variable with parameter \(\kappa (k)\). Since N is Gaussian and independent of Y(k), the pdf of \(Y(k^*) + N\) knowing Y(k) is a Gaussian mixture that can take two forms:
By symmetry, both forms have the same entropy \(h(Y(k^*) + N \mid Y(k))\), so we may work with either pdf. Letting \(\phi \) be the standard normal density, we can write
where
For notational convenience define \(\epsilon = 2(1/2 - \kappa (k))\), \(p=p_{1/2}(x)\), and \(t=\tanh (x)\). Then
The first term vanishes since p is even and t is odd. We apply a Taylor expansion:
The odd terms of the expansion vanish since t is odd and p is even. We therefore obtain:
Thus, finally,
where
There are several ways to express \(g(\sigma )\). For example, we have:
This expression can be reduced to:
where \(X\sim \mathcal {N}(0,1)\). Since \(\tanh ^2(\frac{X}{\sigma }+ \frac{1}{\sigma ^2})\) is bounded by 1, the dominated convergence theorem yields \(g(0)=1\) when \(\sigma \rightarrow 0\); when \(\sigma \rightarrow \infty \), a first-order expansion gives the equivalent \(g(\sigma ) \sim \frac{1}{\sigma ^2}\).
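As a sanity check, both limits of \(g(\sigma )\) can be verified by Monte Carlo simulation. The reduced form \(g(\sigma ) = \mathbb {E}[\tanh ^2(\frac{X}{\sigma } + \frac{1}{\sigma ^2})]\) used below is an assumption inferred from the dominated-convergence argument above (the intermediate equations are omitted in this rendering):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal(1_000_000)

def g(sigma):
    # Assumed reduced form: g(sigma) = E[tanh^2(X/sigma + 1/sigma^2)], X ~ N(0,1).
    return np.mean(np.tanh(X / sigma + 1.0 / sigma**2) ** 2)

# sigma -> 0: the argument diverges to +infinity, tanh^2 -> 1, hence g(0) = 1.
print(g(0.1))
# sigma -> infinity: tanh(u) ~ u, hence g(sigma) ~ 1/sigma^2.
print(g(100.0) * 100.0**2)
```

Both printed values are close to 1, matching the two limits stated above.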
B Proof of Lemma 4
The success exponent is defined by
where in our case
First, for large q, we may approximate \(\mathbb {E}[|\sum _i X_iY_i(k)|]\) by \(|\mathbb {E}[\sum _i X_iY_i(k)]|\),
hence
Second, we have
To remove the absolute values, we distinguish two cases according to whether the sum is positive or negative, assuming q is large enough for the sum to keep a constant sign.
The variance term is the difference of the following two quantities:
Combining all the above expressions we obtain (33).
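The first approximation above (replacing the expectation of the absolute value by the absolute value of the expectation) can be illustrated by a minimal simulation. It assumes the monobit model \(X = Y(k^*) + \sigma N\) with \(Y(k^*)\) uniform on \(\{-1,+1\}\); for the correct key, the sum concentrates around a nonzero mean, so the absolute value effectively commutes with the expectation:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma, trials, q = 2.0, 2000, 5000

# Assumed monobit model: Y(k*) uniform on {-1,+1}, trace X = Y(k*) + sigma*N.
Y = rng.choice([-1.0, 1.0], size=(trials, q))
X = Y + sigma * rng.standard_normal((trials, q))

S = (X * Y).sum(axis=1)                 # sum_i X_i Y_i(k) for the correct key
ratio = np.mean(np.abs(S)) / abs(np.mean(S))
print(ratio)                            # ~1: E[|S|] = |E[S]| for large q
```

Here the sum has mean q and standard deviation \(\sigma \sqrt{q}\), so its sign is constant with overwhelming probability, which is exactly the large-q regime assumed in the proof.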
C Proof of Lemma 5
To prove the success rate of KSA, we first need an estimator of the cumulative distribution function. As kernel we take a function \(\varPhi \) as simple as possible, namely the Heaviside step function: \(\varPhi (x)=0\) if \(x<0\) and \(\varPhi (x)=1\) if \(x\ge 0\).
With this function and for \(x\in \mathbb {R}\), we can estimate \(F(x|Y(k) = 1) - F(x)\) by the following estimator:
We suppose that q is large enough that, by the law of large numbers, \(\sum _{i|Y_i(k)=1}1 = \frac{q}{2}\). Therefore we have:
We notice that \(\sum _{i|Y_i(k)=1}\varPhi (x-X_i) = \frac{1}{2}\sum _i (Y_i(k)+1)\varPhi (x-X_i)\). Therefore
This estimator is a sum of i.i.d. random variables. We can therefore apply the central limit theorem.
The maximum of the absolute value is for \(x=0\) and we obtain:
We notice that \(\Vert \mathbb {E}[ \tilde{F}(x |Y(k) = 1) - \tilde{F}(x) ] \Vert _{\infty } = \Vert \mathbb {E}[ \tilde{F}(x |Y(k) = -1) - \tilde{F}(x) ] \Vert _{\infty } \). To calculate the variance, we take \(x=0\), as it is the value that maximizes the expectation of the distinguisher.
The computation of this variance gives:
Overall, the success exponent is:
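The claim that the maximum of the absolute value is attained at \(x=0\) can be checked empirically. The sketch below assumes the monobit model \(X = Y + \sigma N\) with Y uniform on \(\{-1,+1\}\), builds the Heaviside-kernel (i.e. empirical) CDF estimators, and locates the maximum of \(|\tilde{F}(x|Y=1) - \tilde{F}(x)|\):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(2)
sigma, q = 1.0, 200_000

# Assumed monobit model: Y uniform on {-1,+1}, X = Y + sigma*N.
Y = rng.choice([-1.0, 1.0], size=q)
X = Y + sigma * rng.standard_normal(q)

# Heaviside-kernel (empirical) CDF estimators, evaluated on a grid.
xs = np.linspace(-3.0, 3.0, 601)
F_cond = np.searchsorted(np.sort(X[Y == 1.0]), xs, side="right") / np.sum(Y == 1.0)
F_all = np.searchsorted(np.sort(X), xs, side="right") / q
diff = np.abs(F_cond - F_all)

i = int(np.argmax(diff))
Phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))
print(xs[i])                                     # maximum attained near x = 0
print(diff[i], (Phi(1.0) - Phi(-1.0)) / 2.0)     # ~0.341 for sigma = 1
```

For \(\sigma = 1\) the theoretical maximum is \(\frac{1}{2}(\varPhi _\mathcal {N}(1)-\varPhi _\mathcal {N}(-1)) \approx 0.341\) at \(x=0\), which the empirical maximum reproduces.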
D Proof of Lemma 6
For MIA, we refer to [10, Sect. 5.3] for the theoretical justifications. To obtain a simple closed-form expression of the success exponent, we suppose that \(\sigma \gg 1\) and that the probability density functions are all Gaussian. This means that X|Y(k) is a Gaussian random variable with standard deviation \(\sqrt{4\kappa (k)(1-\kappa (k)) + \sigma ^2}\). Moreover, we keep only the first-order approximation of the SE in \(\mathsf {SNR}=\sigma ^{-2}\).
The Fisher information of a Gaussian random variable with standard deviation \(\zeta \) (with respect to its location parameter) is equal to \(\frac{1}{\zeta ^2}\). Therefore the Fisher information of X knowing \(Y=y(k)\) is:
As this value does not depend on the value of Y(k), we have:
Last, we have to calculate \(\mathrm {Var}(-\log _2 p(X|Y(k) = y(k)))\). Let \(\zeta ^2 = \sigma ^2 + 4\kappa (k)(1-\kappa (k))\) and let C be the normalization constant. We have:
Overall, the success exponent defined in [10, Proposition 6] can be simplified in the case of monobit leakage as:
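Both ingredients of this computation can be checked numerically under the Gaussian assumption made above: the Fisher information of the location parameter of a \(\mathcal {N}(0,\zeta ^2)\) variable is \(\frac{1}{\zeta ^2}\), and \(\mathrm {Var}(-\log _2 p(X)) = \frac{1}{2\ln ^2 2}\), independently of \(\zeta \). The value of \(\zeta \) below is an arbitrary example:

```python
import numpy as np

rng = np.random.default_rng(3)
zeta = 1.7                                   # arbitrary example standard deviation
X = rng.normal(0.0, zeta, 2_000_000)

# Fisher information of the location parameter: E[(d/dmu ln p(X))^2] = 1/zeta^2.
fisher = np.mean((X / zeta**2) ** 2)
print(fisher, 1.0 / zeta**2)

# Self-information in bits, -log2 p(X), for the N(0, zeta^2) density.
neg_log2_p = (0.5 * np.log(2.0 * np.pi * zeta**2) + X**2 / (2.0 * zeta**2)) / np.log(2.0)
print(np.var(neg_log2_p), 0.5 / np.log(2.0) ** 2)   # both ~ 1/(2 ln^2 2)
```

The variance being independent of \(\zeta \) (hence of \(\kappa (k)\)) is what allows the success exponent to take a simple form in the monobit case.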
Cite this paper
de Chérisey, É., Guilley, S., Rioul, O.: Confused yet successful: theoretical comparison of distinguishers for monobit leakages in terms of confusion coefficient and SNR. In: Guo, F., Huang, X., Yung, M. (eds.) Inscrypt 2018. LNCS, vol. 11449. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-14234-6_28