Energy-efficient Personalized Federated Search with Graph for Edge Computing

Published: 09 September 2023

Abstract

Federated Learning (FL) is a popular method for privacy-preserving machine learning on edge devices. However, the heterogeneity of edge devices, including differences in system architecture, local data, and co-running applications, can significantly degrade the energy efficiency of FL. To address these issues, we propose an energy-efficient personalized federated search framework with three key components. First, we search for partial models with high inference efficiency to reduce per-round training energy consumption and the occurrence of stragglers. Second, we build lightweight search controllers that govern model sampling and respond to runtime variance, mitigating new straggler issues caused by co-running applications. Finally, we design an adaptive search update strategy based on graph aggregation to improve personalized training convergence. Together, these components reduce the energy consumption of training by lowering the per-round overhead and accelerating convergence. Experimental results show that our approach achieves up to a 5.02% accuracy improvement and a 3.45× gain in energy efficiency.
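To make the third component concrete, the sketch below shows one generic way a graph-based aggregation step can work: a client-similarity graph is built from flattened model updates, and a single normalized propagation step gives each client a personalized blend of its own update and those of similar clients. This is a minimal illustration under assumed cosine-similarity edge weights; the names (graph_aggregate, self_weight) are hypothetical, and the paper's actual controller design and adaptive update rule are not reproduced here.

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two flattened parameter vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def graph_aggregate(client_params, self_weight=0.5):
    """Aggregate per-client updates over a client-similarity graph.

    client_params: list of 1-D numpy arrays (flattened model parameters),
                   one per client.
    Returns one personalized parameter vector per client, produced by a
    single GCN-style smoothing step over the similarity graph (an
    illustrative stand-in, not the paper's exact update rule).
    """
    n = len(client_params)
    # Adjacency: pairwise similarity between client updates.
    adj = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            adj[i, j] = cosine_similarity(client_params[i], client_params[j])
    # Keep only non-negative edges and row-normalize so each client's
    # neighbor weights sum to 1.
    adj = np.clip(adj, 0.0, None)
    adj /= adj.sum(axis=1, keepdims=True) + 1e-12
    stacked = np.stack(client_params)      # shape (n_clients, n_params)
    neighbor_avg = adj @ stacked           # graph-weighted aggregation
    # Blend each client's own update with its graph-aggregated neighbors.
    return self_weight * stacked + (1.0 - self_weight) * neighbor_avg
```

For example, graph_aggregate([w1, w2, w3]) returns three personalized parameter vectors, one per client, where w1–w3 are flattened client updates; self_weight trades personalization against cross-client knowledge sharing.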

Cited By

  • Personalized Search Engine Optimization for E-Commerce Platforms Based on Content Filtering Algorithm. 2024 IEEE 3rd World Conference on Applied Intelligence and Computing (AIC), 636–641. DOI: 10.1109/AIC61668.2024.10730815. Online publication date: 27 July 2024.

      Published In

      ACM Transactions on Embedded Computing Systems, Volume 22, Issue 5s: Special Issue ESWEEK 2023
      October 2023, 1394 pages
      ISSN: 1539-9087, EISSN: 1558-3465
      DOI: 10.1145/3614235
      Editor: Tulika Mitra

      Publisher

      Association for Computing Machinery, New York, NY, United States

      Publication History

      Published: 09 September 2023
      Accepted: 30 June 2023
      Revised: 02 June 2023
      Received: 23 March 2023
      Published in TECS Volume 22, Issue 5s

      Author Tags

      1. Personalized FL
      2. federated search
      3. energy-efficient
      4. graph-based aggregation

      Qualifiers

      • Research-article
