On Location Hiding in Distributed Systems

  • Conference paper
  • First Online:
Structural Information and Communication Complexity (SIROCCO 2017)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 10641)


Abstract

We consider the following problem: a group of mobile agents performs some task on a terrain modeled as a graph. At a given moment of time, an adversary gets access to the graph and the agents' positions. Shortly before the adversary's observation, the devices have a chance to relocate themselves in order to hide their initial configuration, as the initial configuration may reveal to the adversary some information about the task they performed. Clearly, the agents have to change their locations in as short a time as possible, using minimal energy. In this paper we introduce a definition of a well hiding algorithm, in which the starting and final configurations of the agents have small mutual information. We then discuss the influence of various features of the model on the running time of the optimal well hiding algorithm. We show that if the topology of the graph is known to the agents, then a number of steps proportional to the diameter of the graph is sufficient and necessary. In the unknown topology scenario we consider only the single-agent case. We first show that the task is impossible in the deterministic case if the agent has no memory. We then present a polynomial randomized algorithm. Finally, in the model with memory, we show that a number of steps proportional to the number of edges of the graph is sufficient and necessary. In some sense, we investigate how complex the problem of "losing" information about location (both physical and logical) is in different settings.

The work of the second author was supported by Polish National Science Center grant 2013/09/B/ST6/02258. The work of the third author was supported by Polish National Science Center grant 2015/17/B/ST6/01897.


Notes

  1. That is, we consider the worst-case scenario, implying the strongest security guarantees.

References

  1. Aleliunas, R., Karp, R.M., Lipton, R.J., Lovász, L., Rackoff, C.: Random walks, universal traversal sequences, and the complexity of maze problems. In: FOCS, pp. 218–223 (1979)

  2. Alon, N., Avin, C., Koucký, M., Kozma, G., Lotker, Z., Tuttle, M.R.: Many random walks are faster than one. Comb. Probab. Comput. 20(4), 481–502 (2011)

  3. Boyd, S.P., Diaconis, P., Xiao, L.: Fastest mixing Markov chain on a graph. SIAM Rev. 46(4), 667–689 (2004)

  4. Cover, T.M.: Which processes satisfy the second law? In: Halliwell, J.J., Pérez-Mercader, J., Zurek, W.H. (eds.) Physical Origins of Time Asymmetry, pp. 98–107 (1994)

  5. Cover, T.M., Thomas, J.A.: Elements of Information Theory, 2nd edn. Wiley, Hoboken (2006)

  6. Díaz, C.: Anonymity metrics revisited. In: Anonymous Communication and Its Applications, 09–14 October 2005 (2005)

  7. Dragomir, S., Scholz, M., Sunde, J.: Some upper bounds for relative entropy and applications. Comput. Math. Appl. 39(9), 91–100 (2000)

  8. Efremenko, K., Reingold, O.: How well do random walks parallelize? In: Dinur, I., Jansen, K., Naor, J., Rolim, J. (eds.) APPROX/RANDOM 2009. LNCS, vol. 5687, pp. 476–489. Springer, Heidelberg (2009). https://doi.org/10.1007/978-3-642-03685-9_36

  9. Elsässer, R., Sauerwald, T.: Tight bounds for the cover time of multiple random walks. Theor. Comput. Sci. 412(24), 2623–2641 (2011)

  10. Feige, U.: A tight lower bound on the cover time for random walks on graphs. Random Struct. Algorithms 6(4), 433–438 (1995)

  11. Feige, U.: A tight upper bound on the cover time for random walks on graphs. Random Struct. Algorithms 6(1), 51–54 (1995)

  12. Gao, K., Zhu, Y., Gong, S., Tan, H.: Location privacy protection algorithm for mobile networks. EURASIP J. Wirel. Commun. Netw. 2016(1), 205 (2016)

  13. Levin, D.A., Peres, Y., Wilmer, E.L.: Markov Chains and Mixing Times. AMS, Providence (2009)

  14. Li, M., Zhu, H., Gao, Z., Chen, S., Yu, L., Hu, S., Ren, K.: All your location are belong to us: breaking mobile social networks for automated user location tracking. In: MobiHoc, pp. 43–52. ACM, New York (2014)

  15. Lovász, L.: Random walks on graphs: a survey. Comb. Paul Erdős Eighty 2(1), 1–46 (1993)

  16. Pani, N.K., Rath, B.K., Mishra, S.: A topology-hiding secure on-demand routing protocol for wireless ad hoc network. Int. J. Comput. Appl. 144(4), 42–50 (2016)

  17. Nonaka, Y., Ono, H., Sadakane, K., Yamashita, M.: The hitting and cover times of Metropolis walks. Theor. Comput. Sci. 411(16–18), 1889–1894 (2010)

  18. Pfitzmann, A., Köhntopp, M.: Anonymity, unobservability, and pseudonymity — a proposal for terminology. In: Federrath, H. (ed.) Designing Privacy Enhancing Technologies. LNCS, vol. 2009, pp. 1–9. Springer, Heidelberg (2001). https://doi.org/10.1007/3-540-44702-4_1

  19. Pham, A., Huguenin, K., Bilogrevic, I., Hubaux, J.P.: Secure and private proofs for location-based activity summaries in urban areas. In: UbiComp, pp. 751–762. ACM, New York (2014)

  20. Press, W.H., Teukolsky, S.A., Vetterling, W.T., Flannery, B.P.: Numerical Recipes: The Art of Scientific Computing, 3rd edn. Cambridge University Press, New York (2007)

  21. Reingold, O.: Undirected connectivity in log-space. J. ACM 55(4), 17 (2008)

  22. Rényi, A.: On measures of entropy and information. In: Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, pp. 547–561. University of California Press (1961)

  23. Sun, J., Boyd, S.P., Xiao, L., Diaconis, P.: The fastest mixing Markov process on a graph and a connection to a maximum variance unfolding problem. SIAM Rev. 48(4), 681–699 (2006)

  24. Zhang, Y., Yan, T., Tian, J., Hu, Q., Wang, G., Li, Z.: TOHIP: a topology-hiding multipath routing protocol in mobile ad hoc networks. Ad Hoc Netw. 21, 109–122 (2014)


Acknowledgments

The authors would like to thank the anonymous reviewers for their valuable comments, suggestions and remarks.

Author information

Correspondence to Karol Gotfryd.

Appendices

Appendix 1: Information Theory

We recall some basic definitions and facts from information theory that can be found, e.g., in [5]. In all cases below, \(\log \) denotes the base-2 logarithm.

Definition 3

(Entropy). For a discrete random variable \(X :\mathcal {X}\rightarrow \mathbb {R}\) the entropy of X is defined as \(H\left( X\right) = -\sum _{x \in \mathcal {X}} \Pr [X=x] \log \Pr [X=x]\).

Definition 4

(Conditional entropy). If \(X :\mathcal {X}\rightarrow \mathbb {R}\) and \(Y :\mathcal {Y}\rightarrow \mathbb {R}\) are two discrete random variables, we define the conditional entropy as

$$ H\left( X|Y\right) = -\sum _{y \in \mathcal {Y}} \Pr [Y=y] \sum _{x \in \mathcal {X}} \Pr [X=x|Y=y] \log \Pr [X=x|Y=y]. $$

Fact 1

For any random variables X and Y, \(H\left( X|Y\right) \le H\left( X\right) \), with equality if and only if X and Y are independent.
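
To make Definitions 3 and 4 and Fact 1 concrete, here is a minimal Python sketch (ours, not part of the paper); the joint pmf is chosen purely for illustration and the helper names are our own.

```python
import numpy as np

def entropy(p):
    """H(X) = -sum_x Pr[X=x] log2 Pr[X=x], skipping zero-probability outcomes."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def conditional_entropy(joint):
    """H(X|Y) for a joint pmf given as a matrix: rows indexed by x, columns by y."""
    joint = np.asarray(joint, dtype=float)
    p_y = joint.sum(axis=0)                      # marginal distribution of Y
    return sum(py * entropy(joint[:, j] / py)    # Pr[Y=y] * H(X | Y=y)
               for j, py in enumerate(p_y) if py > 0)

joint = np.array([[0.25, 0.10],                  # Pr[X=x, Y=y], illustrative values
                  [0.10, 0.55]])
print(entropy(joint.sum(axis=1)))                # H(X)   ~ 0.934 bits
print(conditional_entropy(joint))                # H(X|Y) ~ 0.705 bits <= H(X), per Fact 1
```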

Definition 5

(Relative entropy). Let X and Y be two discrete random variables defined on the common space \(\mathcal {X}\) with pmfs p(x) and q(x), respectively. The relative entropy (Kullback-Leibler distance) between p(x) and q(x) is

$$ D\left( p||q\right) = \sum _{x \in \mathcal {X}} p(x) \log \frac{p(x)}{q(x)}. $$

Fact 2

(Information inequality). Let p(x) and q(x) be probability mass functions of two discrete random variables \(X, Y :\mathcal {X}\rightarrow \mathbb {R}\). Then \(D\left( p||q\right) \ge 0\) with equality if and only if \(\forall x \in \mathcal {X}\) \(p(x) = q(x)\).

Fact 3

(Theorem 1 in [7]). Let p(x), \(q(x) > 0\) be probability mass functions of two discrete random variables X and Y, respectively, defined on the space \(\mathcal {X}\). Then

$$ D\left( p||q\right) \le \frac{1}{\ln 2} \left( \sum _{x \in \mathcal {X}} \frac{p^{2}(x)}{q(x)} - 1 \right) . $$
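
The following sketch (ours, for illustration only) checks Fact 2 and the bound of Fact 3 numerically on two strictly positive pmfs chosen by us; `kl_divergence` is an assumed helper name.

```python
import numpy as np

def kl_divergence(p, q):
    """D(p||q) = sum_x p(x) log2(p(x)/q(x)); assumes q(x) > 0 wherever p(x) > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

p = np.array([0.5, 0.3, 0.2])                    # illustrative pmfs, both positive
q = np.array([0.2, 0.5, 0.3])

d = kl_divergence(p, q)
bound = (np.sum(p**2 / q) - 1) / np.log(2)       # right-hand side of Fact 3
print(d, bound)                                  # 0 <= D(p||q) ~ 0.323 <= ~0.813
```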

Definition 6

(Mutual information). If X and Y are two discrete random variables defined on the spaces \(\mathcal {X}\) and \(\mathcal {Y}\), respectively, then the mutual information of X and Y is defined as

$$\begin{aligned} I\left( X,Y\right) = \sum _{x \in \mathcal {X}} \sum _{y \in \mathcal {Y}} \Pr [X=x, Y=y] \log \left( \frac{\Pr [X=x, Y=y]}{\Pr [X=x] \Pr [Y=y]}\right) . \end{aligned}$$
(7)

Fact 4

For any discrete random variables X and Y:

  • \(0 \le I\left( X,Y\right) \le \min \{H\left( X\right) , H\left( Y\right) \}\) and the first equality holds if and only if random variables X and Y are independent,

  • \(I\left( X,Y\right) = I\left( Y,X\right) = H\left( X\right) - H\left( X|Y\right) = H\left( Y\right) - H\left( Y|X\right) \),

  • \(I\left( X,Y\right) = D\left( p(x,y)||p(x)p(y)\right) \) where p(xy) denotes the joint distribution, and p(x)p(y) the product distribution of X and Y.
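
A short numerical check of Definition 6 and Fact 4 (our sketch; the joint pmf is an illustrative assumption): the direct formula (7), which is exactly \(D\left( p(x,y)||p(x)p(y)\right) \), agrees with the equivalent entropy identity \(I\left( X,Y\right) = H\left( X\right) + H\left( Y\right) - H\left( X,Y\right) \).

```python
import numpy as np

def H(p):
    """Base-2 entropy of a pmf (also accepts joint pmfs given as matrices)."""
    p = np.asarray(p, dtype=float).ravel()
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

joint = np.array([[0.25, 0.10],                  # Pr[X=x, Y=y], illustrative values
                  [0.10, 0.55]])
p_x, p_y = joint.sum(axis=1), joint.sum(axis=0)

# Definition 6 / Eq. (7): I(X,Y) = D( p(x,y) || p(x)p(y) )
# (all joint entries are positive here, so the log is safe)
i_direct = np.sum(joint * np.log2(joint / np.outer(p_x, p_y)))

# Fact 4, in the equivalent form I(X,Y) = H(X) + H(Y) - H(X,Y)
i_chain = H(p_x) + H(p_y) - H(joint)

print(i_direct, i_chain)                         # both ~ 0.229 bits
```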

Appendix 2: Markov Chains

We recall some definitions and facts from the theory of Markov chains. They can be found e.g. in [5, 13, 15]. Unless otherwise stated, we will consider only time-homogeneous chains, where transition probabilities do not change with time.

Definition 7

(Total variation distance). For probability distributions \(\mu \) and \(\nu \) on the space \(\mathcal {X}\) we define the total variation distance between \(\mu \) and \(\nu \) as \(d_{\mathrm {TV}}\left( \mu , \nu \right) = \max _{A \subseteq \mathcal {X}} |\mu (A) - \nu (A)|\).

Fact 5

Let \(\mu \) and \(\nu \) be two probability distributions on a common space \(\mathcal {X}\). Then we have \(d_{\mathrm {TV}}\left( \mu , \nu \right) = \frac{1}{2} \sum _{x \in \mathcal {X}} |\mu (x) - \nu (x)|\).
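
The two characterizations in Definition 7 and Fact 5 can be compared by brute force on a small space; this sketch (ours, with illustrative distributions) enumerates all subsets A.

```python
from itertools import combinations
import numpy as np

mu = np.array([0.5, 0.3, 0.2])                   # illustrative distributions
nu = np.array([0.2, 0.5, 0.3])
n = len(mu)

# Definition 7: maximum of |mu(A) - nu(A)| over all subsets A of the space
tv_max = max(abs(mu[list(A)].sum() - nu[list(A)].sum())
             for r in range(n + 1) for A in combinations(range(n), r))

# Fact 5: half the L1 distance between the two distributions
tv_l1 = 0.5 * np.abs(mu - nu).sum()

print(tv_max, tv_l1)                             # both 0.3
```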

Definition 8

Let \(P^{t}(x_0, \cdot )\) denote the distribution of an ergodic Markov chain M on a finite space \(\mathcal {X}\) in step t when starting in the state \(x_0\), and let \(\pi \) be the stationary distribution of M. We define \(d(t) = \max _{x \in \mathcal {X}} d_{\mathrm {TV}}\left( P^{t}(x, \cdot ), \pi \right) \) and \(\bar{d}(t) = \max _{x, y \in \mathcal {X}} d_{\mathrm {TV}}\left( P^{t}(x, \cdot ), P^{t}(y, \cdot )\right) \).

Fact 6

Let \(\mathcal {P}\) be the family of all probability distributions on \(\mathcal {X}\). Then

  • \(d(t) \le \bar{d}(t) \le 2 d(t)\),

  • \(d(t) = \sup _{\mu \in \mathcal {P}} d_{\mathrm {TV}}\left( \mu P^{t}, \pi \right) = \sup _{\mu , \nu \in \mathcal {P}} d_{\mathrm {TV}}\left( \mu P^{t}, \nu P^{t}\right) \).

Definition 9

(Mixing time). For an ergodic Markov chain M on finite space \(\mathcal {X}\) we define the mixing time as \(t_{\mathrm {mix}}\left( \varepsilon \right) = \min \{t :d(t) \le \varepsilon \}\) and \(t_{\mathrm {mix}}= t_{\mathrm {mix}}\left( 1/4\right) \).

Fact 7

For any \(\varepsilon > 0\), \(t_{\mathrm {mix}}\left( \varepsilon \right) \le \left\lceil \log \varepsilon ^{-1} \right\rceil t_{\mathrm {mix}}\).
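
For a small explicit chain, d(t) and the mixing time can be computed directly by powering the transition matrix. The sketch below is ours; the chain is a lazy random walk on a 4-vertex path (cf. Definition 10 below), and the total variation distances are evaluated via Fact 5.

```python
import numpy as np

# Lazy random walk on a 4-vertex path: an ergodic chain with pi = [1,2,2,1]/6.
P = np.array([[0.50, 0.50, 0.00, 0.00],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.00, 0.00, 0.50, 0.50]])

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalized.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

def d(t):
    """d(t) = max_x d_TV(P^t(x, .), pi), computing TV distance via Fact 5."""
    Pt = np.linalg.matrix_power(P, t)
    return max(0.5 * np.abs(row - pi).sum() for row in Pt)

t_mix = next(t for t in range(1, 100) if d(t) <= 0.25)   # Definition 9
print(t_mix, d(t_mix))                           # 4, ~0.211 for this chain
```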

Definition 10

(Random walk). The random walk on a graph \(G = (V,E)\) with n nodes and m edges is a Markov chain on V with transition probabilities

$$ p_{ij} = \Pr [X_{t+1} = v_j \mid X_t = v_i] = \begin{cases} 1/\deg (v_i), &{} \text {if } \{v_i,v_j\} \in E,\\ 0, &{} \text {otherwise.} \end{cases} $$

The lazy random walk is the walk that, at each step t, stays at the current vertex with probability 1/2 and otherwise performs one step of the simple random walk.
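
A minimal sketch (ours) of Definition 10: constructing the simple and lazy walk transition matrices from a 0/1 adjacency matrix; `walk_matrices` is an assumed helper name, not from the paper.

```python
import numpy as np

def walk_matrices(adj):
    """Return (P, P_lazy) for the graph with the given 0/1 adjacency matrix."""
    adj = np.asarray(adj, dtype=float)
    deg = adj.sum(axis=1)                        # vertex degrees
    P = adj / deg[:, None]                       # p_ij = 1/deg(v_i) iff {v_i,v_j} in E
    P_lazy = 0.5 * np.eye(len(deg)) + 0.5 * P    # stay put with probability 1/2
    return P, P_lazy

# Example: the 4-cycle.
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]])
P, P_lazy = walk_matrices(adj)
print(P_lazy)                                    # rows sum to 1; diagonal is 0.5
```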

The following Fact 8 gives an upper bound on the mixing time of random walks. It follows, e.g., from Theorem 10.14 in [13] together with the properties of the cover time and its relation to the mixing time (see [11]).

Fact 8

For a lazy random walk on an arbitrary connected graph G with n vertices, \(t_{\mathrm {mix}} = O\left( n^3\right) \).

Fact 9

(cf. [4, 5, 22]). Let \(M = (X_0, X_1, \ldots )\) be an ergodic Markov chain on finite space \(\mathcal {X}\) with transition matrix P and stationary distribution \(\pi \).

  • For any two probability distributions \(\mu \) and \(\nu \) on space \(\mathcal {X}\) the relative entropy \(D\left( \mu P^{t}||\nu P^{t}\right) \) decreases with t, i.e. \(D\left( \mu P^{t}||\nu P^{t}\right) \ge D\left( \mu P^{t+1}||\nu P^{t+1}\right) \).

  • For any initial distribution \(\mu \) the relative entropy \(D\left( \mu P^{t}||\pi \right) \) decreases with t. Furthermore, \(\lim _{t \rightarrow \infty } D\left( \mu P^{t}||\pi \right) = 0\).

  • The conditional entropy \(H\left( X_0|X_t\right) \) is increasing in t.
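
The first two monotonicity claims of Fact 9 are easy to observe numerically. This sketch (ours; the lazy walk on a 3-vertex path serves as an illustrative ergodic chain) tracks \(D\left( \mu P^{t}||\pi \right) \) over a few steps.

```python
import numpy as np

P = np.array([[0.50, 0.50, 0.00],                # lazy random walk on a 3-vertex path
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
pi = np.array([0.25, 0.50, 0.25])                # its stationary distribution
mu = np.array([1.0, 0.0, 0.0])                   # start deterministically at v_1

def kl(p, q):
    """D(p||q) in bits, skipping zero-probability terms of p."""
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

for t in range(6):
    print(t, kl(mu, pi))                         # 2.0, 0.5, ~0.094, ... decreasing to 0
    mu = mu @ P                                  # advance one step: mu <- mu P
```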


Copyright information

© 2017 Springer International Publishing AG

About this paper


Cite this paper

Gotfryd, K., Klonowski, M., Pająk, D. (2017). On Location Hiding in Distributed Systems. In: Das, S., Tixeuil, S. (eds) Structural Information and Communication Complexity. SIROCCO 2017. Lecture Notes in Computer Science, vol. 10641. Springer, Cham. https://doi.org/10.1007/978-3-319-72050-0_11


  • DOI: https://doi.org/10.1007/978-3-319-72050-0_11

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-72049-4

  • Online ISBN: 978-3-319-72050-0

  • eBook Packages: Computer Science, Computer Science (R0)
