DOI: 10.1145/3576915.3616585
Research Article

Control, Confidentiality, and the Right to be Forgotten

Published: 21 November 2023

    Abstract

    Recent digital rights frameworks give users the right to delete their data from systems that store and process their personal information (e.g., the "right to be forgotten" in the GDPR).
    How should deletion be formalized in complex systems that interact with many users and store derivative information? We argue that prior approaches fall short. Definitions of machine unlearning [6] are too narrowly scoped and do not apply to general interactive settings. The natural approach of deletion-as-confidentiality [15] is too restrictive: by requiring secrecy of deleted data, it rules out social functionalities.
    We propose a new formalism: deletion-as-control. It allows users' data to be freely used before deletion, while also imposing a meaningful requirement after deletion, thereby giving users more control.
    Deletion-as-control provides new ways of achieving deletion in diverse settings. We apply it to social functionalities, and give a new unified view of various machine unlearning definitions from the literature. This is done by way of a new adaptive generalization of history independence.
    Deletion-as-control also provides a new approach to the goal of machine unlearning, that is, to maintaining a model while honoring users' deletion requests. We show that publishing a sequence of updated models that are differentially private under continual release satisfies deletion-as-control. The accuracy of such an algorithm does not depend on the number of deleted points, in contrast to the machine unlearning literature.
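    The classical (non-adaptive) notion of history independence mentioned above can be illustrated with a toy example: a data structure is history independent if its memory representation is a canonical function of its current contents, so an observer inspecting memory cannot tell which order insertions and deletions occurred in, or whether a deleted item was ever present. A minimal sketch (all names illustrative; this is not the paper's adaptive generalization):

```python
import bisect

class HistoryIndependentSet:
    """Toy history-independent set: the representation is always the
    sorted tuple of the current elements, regardless of the order in
    which insertions and deletions occurred."""

    def __init__(self):
        self._items = []  # canonical layout: sorted, duplicate-free

    def insert(self, x):
        i = bisect.bisect_left(self._items, x)
        if i == len(self._items) or self._items[i] != x:
            self._items.insert(i, x)

    def delete(self, x):
        i = bisect.bisect_left(self._items, x)
        if i < len(self._items) and self._items[i] == x:
            self._items.pop(i)

    def representation(self):
        return tuple(self._items)
```

    Two different operation histories that end in the same set produce identical representations, which is exactly the property that makes deleted data leave no trace in the structure's state.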
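    The continual-release guarantee invoked in the last paragraph originates with the binary-tree mechanism of Dwork et al. [12] and Chan et al. [8]. As a hedged sketch of that primitive, here it is for a running counter over a 0/1 stream rather than the paper's model-maintenance setting (parameter choices and names are illustrative):

```python
import math
import random

def lap(scale):
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def binary_mechanism(stream, epsilon):
    """Binary-tree mechanism [8, 12]: release all prefix sums of a 0/1
    stream under epsilon-DP in the continual-observation model. Each
    element touches one partial sum per tree level, so the budget is
    split across ~log2(T) levels and per-step error is polylog(T)."""
    T = len(stream)
    L = max(1, math.ceil(math.log2(T + 1)))  # number of tree levels
    scale = L / epsilon                      # Laplace scale per partial sum
    alpha = [0.0] * (L + 1)                  # exact dyadic partial sums
    noisy = [0.0] * (L + 1)                  # their noisy counterparts
    out = []
    for t in range(1, T + 1):
        i = (t & -t).bit_length() - 1        # lowest set bit of t
        # Fold the lower levels into a new partial sum at level i.
        alpha[i] = sum(alpha[j] for j in range(i)) + stream[t - 1]
        for j in range(i):
            alpha[j] = 0.0
            noisy[j] = 0.0
        noisy[i] = alpha[i] + lap(scale)
        # Prefix sum at time t = noisy partial sums on the set bits of t.
        out.append(sum(noisy[j] for j in range(L + 1) if (t >> j) & 1))
    return out
```

    The point the abstract makes is that a sequence of outputs released this way is insensitive to any single stream element; the paper shows this style of guarantee, applied to model updates, yields deletion-as-control with accuracy independent of the number of deletions.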

    References

    [1]
    A. Achille, M. Kearns, C. Klingenberg, and S. Soatto. AI model disgorgement: Methods and choices. arXiv preprint arXiv:2304.03545, 2023.
    [2]
    M. Altman, A. Cohen, K. Nissim, and A. Wood. What a hybrid legal-technical analysis teaches us about privacy regulation: The case of singling out. BUJ Sci. & Tech. L., 27:1, 2021.
    [3]
    R. Bassily, A. Smith, and A. Thakurta. Private empirical risk minimization: Efficient algorithms and tight error bounds. In 2014 IEEE 55th annual symposium on foundations of computer science, pages 464--473. IEEE, 2014.
    [4]
    G. E. Blelloch and D. Golovin. Strongly history-independent hashing with applications. In 48th Annual IEEE Symposium on Foundations of Computer Science (FOCS), 2007.
    [5]
    L. Bourtoule, V. Chandrasekaran, C. A. Choquette-Choo, H. Jia, A. Travers, B. Zhang, D. Lie, and N. Papernot. Machine unlearning. In 42nd IEEE Symposium on Security and Privacy, 2021.
    [6]
    Y. Cao and J. Yang. Towards making systems forget with machine unlearning. In 2015 IEEE Symposium on Security and Privacy, pages 463--480. IEEE, 2015.
    [7]
    N. Carlini, F. Tramer, E. Wallace, M. Jagielski, A. Herbert-Voss, K. Lee, A. Roberts, T. Brown, D. Song, U. Erlingsson, et al. Extracting training data from large language models. In 30th USENIX Security Symposium (USENIX Security 21), pages 2633--2650, 2021.
    [8]
    T.-H. H. Chan, E. Shi, and D. Song. Private and continual release of statistics. ACM Transactions on Information and System Security (TISSEC), 14(3):1--24, 2011.
    [9]
    A. Cohen and K. Nissim. Linear program reconstruction in practice. Journal of Privacy and Confidentiality, 10(1), Jan. 2020. URL https://journalprivacyconfidentiality.org/index.php/jpc/article/view/711.
    [10]
    A. Cohen, A. Smith, M. Swanberg, and P. N. Vasudevan. Control, confidentiality, and the right to be forgotten. arXiv preprint arXiv:2210.07876, 2022.
    [11]
    Court of Justice of the European Union. Press release no 70/14. https://curia.europa.eu/jcms/upload/docs/application/pdf/2014-05/cp140070en.pdf, May 2014.
    [12]
    C. Dwork, M. Naor, T. Pitassi, and G. N. Rothblum. Differential privacy under continual observation. In L. J. Schulman, editor, Proceedings of the 42nd ACM Symposium on Theory of Computing, STOC 2010, pages 715--724. ACM, 2010.
    [13]
    C. Dwork, M. Naor, T. Pitassi, G. N. Rothblum, and S. Yekhanin. Pan-private streaming algorithms. In ICS, pages 66--80, 2010.
    [14]
    B. Fowler. Data breaches break record in 2021. CNET, 2022. URL https://www.cnet.com/news/privacy/record-number-of-data-breaches-reported-in-2021-new-report-says/.
    [15]
    S. Garg, S. Goldwasser, and P. N. Vasudevan. Formalizing data deletion in the context of the right to be forgotten. In Advances in Cryptology-EUROCRYPT 2020: 39th Annual International Conference on the Theory and Applications of Cryptographic Techniques, Zagreb, Croatia, May 10-14, 2020, Proceedings, Part II 30, pages 373--402. Springer, 2020.
    [16]
    A. Ginart, M. Guan, G. Valiant, and J. Y. Zou. Making AI forget you: Data deletion in machine learning. In Advances in Neural Information Processing Systems, pages 3513--3526, 2019.
    [17]
    J. Godin and P. Lamontagne. Deletion-compliance in the absence of privacy. In 2021 18th International Conference on Privacy, Security and Trust (PST), pages 1--10. IEEE, 2021.
    [18]
    A. Golatkar, A. Achille, and S. Soatto. Eternal sunshine of the spotless net: Selective forgetting in deep networks. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pages 9304--9312, 2020.
    [19]
    A. Golatkar, A. Achille, and S. Soatto. Forgetting outside the box: Scrubbing deep networks of information accessible from input-output observations. arXiv preprint arXiv:2003.02960, 2020.
    [20]
    Google. Right to be forgotten overview. https://support.google.com/legal/answer/10769224. Accessed: 2023-04-18.
    [21]
    Google. Helping public health officials combat covid-19. https://blog.google/technology/health/covid-19-community-mobility-reports/, 2020. Accessed: 2023-04-18.
    [22]
    C. Guo, T. Goldstein, A. Hannun, and L. van der Maaten. Certified data removal from machine learning models, 2020.
    [23]
    V. Gupta, C. Jung, S. Neel, A. Roth, S. Sharifi-Malvajerdi, and C. Waites. Adaptive machine unlearning. Advances in Neural Information Processing Systems, 34:16319--16330, 2021.
    [24]
    J. D. Hartline, E. S. Hong, A. E. Mohr, W. R. Pentney, and E. C. Rocke. Characterizing history independent data structures. Algorithmica, 42(1):57--74, 2005.
    [25]
    P. Jain, S. Raskhodnikova, S. Sivakumar, and A. D. Smith. The price of differential privacy under continual observation. CoRR, abs/2112.00828, 2021. URL https://arxiv.org/abs/2112.00828.
    [26]
    JASON. Consistency of data products and formal privacy methods for the 2020 Census. Panel Report JSR 21-02, JASON, The MITRE Corporation, January 2022.
    [27]
    P. Kairouz, B. McMahan, S. Song, O. Thakkar, A. Thakurta, and Z. Xu. Practical and private (deep) learning without sampling or shuffling. In M. Meila and T. Zhang, editors, Proceedings of the 38th International Conference on Machine Learning (ICML), 2021.
    [28]
    G. King and N. Persily. Unprecedented Facebook URLs dataset now available for academic research through Social Science One. https://socialscience.one/blog/unprecedented-facebook-urls-dataset-now-available-research-through-social-science-one, 2020. Accessed: 2023-04-18.
    [29]
    D. Micciancio. Oblivious data structures: applications to cryptography. In Proceedings of the twenty-ninth annual ACM symposium on Theory of computing, pages 456--464, 1997.
    [30]
    M. Naor and V. Teague. Anti-persistence: History independent data structures. In Proceedings of the thirty-third annual ACM symposium on Theory of computing, pages 492--501, 2001.
    [31]
    National Conference of State Legislatures. State laws related to digital privacy. https://www.ncsl.org/technology-and-communication/state-laws-related-to-digital-privacy, June 2022. Accessed: 2023-04-18.
    [32]
    S. Neel, A. Roth, and S. Sharifi-Malvajerdi. Descent-to-delete: Gradient-based methods for machine unlearning, 2020.
    [33]
    A. Ng. Homeland Security records show 'shocking' use of phone data, ACLU says. Politico, 2022. URL https://www.politico.com/news/2022/07/18/dhs-location-data-aclu-00046208.
    [34]
    K. Nissim, A. Bembenek, A. Wood, M. Bun, M. Gaboardi, U. Gasser, D. R. O'Brien, T. Steinke, and S. Vadhan. Bridging the gap between computer science and legal approaches to privacy. Harv. JL & Tech., 31:687, 2017.
    [35]
    Y. Polyanskiy. Two fundamental probabilistic models. https://ocw.mit.edu/courses/6-436j-fundamentals-of-probability-fall-2018/resources/mit6_436jf18_lec02/, 2018. Accessed: 2023-04-18.
    [36]
    A. Sekhari, J. Acharya, G. Kamath, and A. T. Suresh. Remember what you want to forget: Algorithms for machine unlearning. arXiv preprint arXiv:2103.03279, 2021.
    [37]
    R. K. Slaughter, J. Kopec, and M. Batal. Algorithms and economic justice: A taxonomy of harms and a path forward for the federal trade commission. Yale JL & Tech., 23:1, 2020.
    [38]
    A. Thudi, H. Jia, I. Shumailov, and N. Papernot. On the necessity of auditable algorithmic definitions for machine unlearning. In 31st USENIX Security Symposium (USENIX Security 22), pages 4007--4022, 2022.
    [39]
    E. Ullah, T. Mai, A. Rao, R. Rossi, and R. Arora. Machine unlearning via algorithmic stability. arXiv preprint arXiv:2102.13179, 2021.
    [40]
    US Census Bureau. Why the Census Bureau chose differential privacy. Technical Report C2020BR-03, US Census Bureau, March 2023.
    [41]
    S. Vadhan and W. Zhang. Concurrent composition theorems for all standard variants of differential privacy. arXiv preprint arXiv:2207.08335, 2022. To appear in the 55th Annual ACM Symposium on Theory of Computing (STOC), 2023.

    Cited By

    • (2024) The Right to Be Zero-Knowledge Forgotten. Proceedings of the 19th International Conference on Availability, Reliability and Security, 10.1145/3664476.3669973, pages 1-9. Online publication date: 30-Jul-2024.

    Published In

    cover image ACM Conferences
    CCS '23: Proceedings of the 2023 ACM SIGSAC Conference on Computer and Communications Security
    November 2023
    3722 pages
    ISBN:9798400700507
    DOI:10.1145/3576915

    Publisher

    Association for Computing Machinery, New York, NY, United States



    Author Tags

    1. deletion
    2. differential privacy
    3. history independence
    4. machine unlearning


    Conference

    CCS '23

    Acceptance Rates

    Overall Acceptance Rate: 1,210 of 6,719 submissions, 18%

