Scenario-based Adaptations of Differential Privacy: A Technical Survey

Published: 26 April 2024
  Abstract

    Differential privacy has become the de facto standard for defining and preserving privacy. It has seen great success in both local data privacy and statistical dataset privacy scenarios. As a primitive definition, standard differential privacy has been adapted to a wide range of practical scenarios. In this work, we summarize scenario-specific adaptations of differential privacy and analyze the correlations between data characteristics and differential privacy design. We present them along two main lines: adaptations of differential privacy for local data privacy and adaptations for statistical dataset privacy. With a focus on differential privacy design, this survey aims to provide guiding rules for designing differential privacy mechanisms in specific scenarios, to identify opportunities for adaptively applying differential privacy to emerging technologies, and to further improve differential privacy itself with the assistance of cryptographic primitives.
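    For reference, the primitive definition that these adaptations build on is the standard \(\epsilon\)-differential privacy guarantee of Dwork, McSherry, Nissim, and Smith: a randomized mechanism \(M\) is \(\epsilon\)-differentially private if, for every pair of neighboring datasets \(D\) and \(D'\) (differing in a single record) and every set of outputs \(S\),

    \[ \Pr[M(D) \in S] \le e^{\epsilon} \cdot \Pr[M(D') \in S]. \]

    Roughly speaking, the local and statistical-dataset adaptations surveyed here vary what counts as neighboring inputs and where the randomization is applied, at each individual user in the local model or at a trusted curator in the statistical dataset model.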

      Published In

      ACM Computing Surveys, Volume 56, Issue 8
      August 2024
      963 pages
      ISSN: 0360-0300
      EISSN: 1557-7341
      DOI: 10.1145/3613627
      Editors: David Atienza, Michela Milano

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 26 April 2024
      Online AM: 05 March 2024
      Accepted: 15 December 2023
      Revised: 29 May 2023
      Received: 16 April 2022
      Published in CSUR Volume 56, Issue 8

      Author Tags

      1. (Local) differential privacy
      2. data privacy

      Qualifiers

      • Survey

      Funding Sources

      • Australian Research Council (ARC)
