DOI: 10.1145/3579375.3579392
research-article

ICB FL: Implicit Class Balancing Towards Fairness in Federated Learning

Published: 13 March 2023

Abstract

Federated learning (FL) is a promising machine learning paradigm that allows many clients to jointly train a model without sharing their raw data. Because standard FL is designed from the server’s perspective, unfairness can arise throughout the learning process, including the global model optimization phase. Some existing works address this issue by guaranteeing that the global model achieves similar accuracy across different classes (i.e., labels), but they do not consider the implicit classes (different representations of one label) beneath those labels, where the fairness issue persists. In this paper, we focus on the fairness issue in the global model optimization phase and close this research gap by introducing the Implicit Class Balancing (ICB) Federated Learning framework with a Single Class Training Scheme (SCTS). In ICB FL, the server first broadcasts the current global model and assigns a particular class (label) to each client. Each client then locally trains the model only on the data of its assigned class (SCTS) and sends the resulting gradient back to the server. The server subsequently performs unsupervised learning to identify the implicit classes and generates a balanced weight for each client. Finally, the server averages the received gradients with these weights and updates the global model. We evaluate ICB FL on three datasets, and the experimental results show that it effectively enhances fairness across both explicit and implicit classes.
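
The abstract describes one ICB FL round end to end: broadcast, per-client class assignment, single-class local training (SCTS), gradient upload, unsupervised identification of implicit classes, and balanced weighted aggregation. The Python sketch below illustrates that flow under explicit assumptions: a toy softmax-regression model stands in for the clients' network, k-means over the flattened client gradients stands in for the unspecified unsupervised step, the inverse-cluster-size weighting is our own reading of "balanced weight", and all function names (single_class_update, balanced_weights, icb_round) are hypothetical rather than the authors' implementation.

import numpy as np
from sklearn.cluster import KMeans

def single_class_update(W, client_data, assigned_label):
    # SCTS: the client trains only on samples of its assigned class.
    # A one-step softmax-regression gradient stands in for real local training.
    X, y = client_data                          # X: (n, d), y: (n,) integer labels
    Xc = X[y == assigned_label]
    logits = Xc @ W                             # W: (d, n_classes)
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    p[:, assigned_label] -= 1.0                 # dL/dlogits for cross-entropy
    return Xc.T @ p / len(Xc)                   # gradient w.r.t. W, shape (d, n_classes)

def balanced_weights(client_grads, n_implicit_classes):
    # Server: cluster the client gradients to discover implicit classes, then
    # weight each client inversely to its cluster's size so every implicit
    # class carries the same total weight in the aggregate.
    G = np.stack([g.ravel() for g in client_grads])
    labels = KMeans(n_clusters=n_implicit_classes, n_init=10).fit_predict(G)
    sizes = np.bincount(labels, minlength=n_implicit_classes)
    return 1.0 / (n_implicit_classes * sizes[labels])   # sums to 1 when no cluster is empty

def icb_round(global_W, clients, assignments, n_implicit_classes, lr=1.0):
    # One ICB FL round: broadcast, single-class local updates, clustering-based
    # weighting, weighted averaging, and global model update.
    grads = [single_class_update(global_W, data, assignments[i])
             for i, data in enumerate(clients)]
    w = balanced_weights(grads, n_implicit_classes)
    update = sum(wi * gi for wi, gi in zip(w, grads))
    return global_W - lr * update

With this inverse-cluster-size weighting, every discovered implicit class contributes the same total weight to the aggregated update regardless of how many clients happen to hold that representation, which is the balancing effect the abstract attributes to ICB FL.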


Cited By

  • (2024) Accelerating Asynchronous Federated Learning Convergence via Opportunistic Mobile Relaying. IEEE Transactions on Vehicular Technology 73, 7 (2024), 10668–10680. https://doi.org/10.1109/TVT.2024.3384061

      Published In

      ACSW '23: Proceedings of the 2023 Australasian Computer Science Week
      January 2023
      272 pages
      ISBN:9798400700057
      DOI:10.1145/3579375
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Author Tags

      1. Clustering
      2. Fairness
      3. Federated Learning

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Conference

      ACSW 2023
      ACSW 2023: 2023 Australasian Computer Science Week
      January 30 - February 3, 2023
Melbourne, VIC, Australia

      Acceptance Rates

      Overall Acceptance Rate 61 of 141 submissions, 43%

