DOI: 10.1145/3534678.3539039

Felicitas: Federated Learning in Distributed Cross Device Collaborative Frameworks

Published: 14 August 2022

Abstract

Felicitas is a distributed cross-device Federated Learning (FL) framework that addresses the industrial difficulties of FL in large-scale device deployment scenarios. In Felicitas, FL-Clients are deployed on mobile or embedded devices, while the FL-Server is deployed on a cloud platform. We also summarize the challenges of FL deployment in industrial cross-device scenarios (massive parallelism, stateless clients, absence of client identifiers, high unreliability, and unsteady, complex deployment) and provide reliable solutions. The source code and documentation are available at https://www.mindspore.cn/. In addition, Felicitas has been deployed on mobile phones in the real world. At the end of the paper, we demonstrate the validity of the framework through experiments.
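The setting the abstract describes is the standard cross-device FL loop: devices train locally on private data, and a cloud server aggregates their model updates. As a minimal illustrative sketch of that aggregation step, here is the classic FedAvg rule (weighted averaging of client updates); this is not the Felicitas implementation, and the linear model, learning rate, and helper names below are toy choices for illustration only:

```python
import numpy as np

def local_update(w, X, y, lr=0.1):
    """One client's local gradient step on a linear least-squares model
    (toy stand-in for on-device training)."""
    grad = X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def fed_avg(global_w, clients):
    """Server-side FedAvg: average client updates weighted by local
    dataset size; raw data never leaves the clients."""
    updates = [local_update(global_w, X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    return np.average(np.stack(updates), axis=0, weights=sizes)

# Toy run: three simulated devices jointly fit y = 2x without pooling data.
rng = np.random.default_rng(0)
clients = []
for _ in range(3):
    X = rng.normal(size=(20, 1))
    clients.append((X, 2.0 * X[:, 0]))

w = np.zeros(1)
for _ in range(100):
    w = fed_avg(w, clients)
print(round(float(w[0]), 3))  # converges toward the shared optimum
```

Weighting by dataset size makes the aggregate equivalent to a gradient step on the pooled data; a production cross-device system additionally has to handle the stateless, unreliable, massively parallel clients the abstract enumerates.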


Cited By

  • (2025) Efficient multi-job federated learning scheduling with fault tolerance. Peer-to-Peer Networking and Applications, 18(2). DOI: 10.1007/s12083-024-01847-z. Online publication date: 16 Jan 2025.
  • (2024) COALA. Proceedings of the 41st International Conference on Machine Learning, 62723--62742. DOI: 10.5555/3692070.3694666. Online publication date: 21 Jul 2024.
  • (2024) Survey of Federated Learning Models for Spatial-Temporal Mobility Applications. ACM Transactions on Spatial Algorithms and Systems, 10(3), 1--39. DOI: 10.1145/3666089. Online publication date: 13 Jul 2024.
  • (2024) Flotta: A Secure and Flexible Spark-Inspired Federated Learning Framework. 2024 2nd International Conference on Federated Learning Technologies and Applications (FLTA), 156--161. DOI: 10.1109/FLTA63145.2024.10840050. Online publication date: 17 Sep 2024.
  • (2023) Efficient Scheduling for Multi-Job Federated Learning Systems with Client Sharing. 2023 IEEE Intl Conf on Dependable, Autonomic and Secure Computing (DASC/PiCom/CBDCom/CyberSciTech), 891--898. DOI: 10.1109/DASC/PiCom/CBDCom/Cy59711.2023.10361429. Online publication date: 14 Nov 2023.

      Published In

      KDD '22: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
      August 2022
      5033 pages
      ISBN:9781450393850
      DOI:10.1145/3534678


      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. cross-device federated learning
      2. data mining under privacy constraints
      3. distributed framework
      4. large-scale

      Qualifiers

      • Research-article

      Conference

      KDD '22

      Acceptance Rates

Overall acceptance rate: 1,133 of 8,635 submissions (13%)


      Article Metrics

• Downloads (last 12 months): 80
• Downloads (last 6 weeks): 10
      Reflects downloads up to 25 Jan 2025
