Toward Secure and Robust Federated Distillation in Distributed Cloud: Challenges and Design Issues

Published: 23 February 2024

Abstract

Federated learning (FL) offers a promising solution for effectively leveraging data scattered across distributed cloud systems. Despite this potential, its heavy communication overhead burdens the distributed cloud. Federated distillation (FD) is a distributed learning technique with low communication cost, in which clients exchange only model logits rather than model parameters. However, FD faces challenges related to data heterogeneity and security, and its conventional aggregation method is vulnerable to malicious uploads. In this article, we discuss the limitations of FL and the challenges of FD in the context of distributed cloud systems. To address these issues, we propose a blockchain-based framework for secure and robust FD. Specifically, we develop a pre-training data preparation method to reduce data distribution heterogeneity and an aggregation method to enhance the robustness of the aggregation process. Moreover, a committee/worker selection strategy is devised to optimize task allocation among clients. Experimental evaluations demonstrate the effectiveness of the proposed framework.
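The abstract's core idea is that FD clients upload per-sample logits, which the server aggregates, and that a naive average is fragile against malicious uploads. The paper's actual aggregation scheme is not reproduced here; the sketch below contrasts conventional logit averaging with a coordinate-wise median, a simple robust alternative chosen for illustration (the function name and the median choice are our assumptions, not the authors' method):

```python
import numpy as np

def aggregate_logits(client_logits, robust=True):
    """Aggregate per-sample logits uploaded by FD clients.

    client_logits: array-like of shape (n_clients, n_samples, n_classes).
    Conventional FD averages the logits; a coordinate-wise median is one
    simple robust alternative that bounds the influence of any single
    malicious client on each aggregated value.
    """
    stacked = np.asarray(client_logits, dtype=float)
    if robust:
        return np.median(stacked, axis=0)  # coordinate-wise median
    return np.mean(stacked, axis=0)        # conventional averaging

# Three honest clients agree on one sample; one malicious client
# uploads inflated logits to poison the aggregate.
honest = [[[2.0, 0.5]], [[2.1, 0.4]], [[1.9, 0.6]]]
malicious = [[[100.0, -100.0]]]
logits = honest + malicious

mean_agg = aggregate_logits(logits, robust=False)    # dragged to 26.5
median_agg = aggregate_logits(logits, robust=True)   # stays near 2.05
```

The example shows why robustness matters in this setting: a single outlier shifts the mean arbitrarily far, while the median stays within the range of the honest uploads.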


Published In

IEEE Network: The Magazine of Global Internetworking, Volume 38, Issue 4
July 2024, 296 pages
Publisher: IEEE Press
