Abstract
Federated Learning (FL) has received increasing attention from both researchers and industry because it can break down data islands while protecting data privacy. However, the original federated averaging algorithm ignores differences in the data distributions of the participants (which are widespread in practice), and this seriously undermines the performance of deep learning models; the degradation is even worse when the gap in data volume between participants is large. In this paper, we propose a novel and universal federated learning method, named Fed-Tra, that effectively weakens biases in model training and builds high-precision models. Fed-Tra achieves this by dynamically adjusting the weights of the local training samples of every participant in each round. Our evaluation on real-world datasets shows that Fed-Tra achieves nearly a +28% improvement in F1-score over the original federated averaging algorithm.
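The abstract only summarizes the idea of per-round sample reweighting; the exact Fed-Tra rule is given in the paper itself. As a rough, hedged illustration of the general idea, the sketch below runs a FedAvg-style loop in which each participant re-weights its local samples every round before local training. The synthetic non-IID data, all function names (make_client, local_update, reweight), and the boosting-style exponential weighting rule are illustrative assumptions, not the authors' Fed-Tra algorithm.

# Minimal sketch (assumptions, not the Fed-Tra implementation): weighted logistic
# regression clients inside a FedAvg loop, with per-round local sample reweighting.
import numpy as np

rng = np.random.default_rng(0)

def make_client(n, shift):
    # Synthetic binary-classification data; 'shift' skews the feature distribution
    # so the two clients are non-IID, and n controls the data-volume imbalance.
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(float)
    return X, y

clients = [make_client(200, 0.0), make_client(50, 1.5)]  # unbalanced data volumes

def predict(w, X):
    return 1.0 / (1.0 + np.exp(-X @ w))

def local_update(w, X, y, sample_w, lr=0.1, epochs=5):
    # Weighted gradient descent on the local cross-entropy loss.
    w = w.copy()
    for _ in range(epochs):
        p = predict(w, X)
        grad = X.T @ (sample_w * (p - y)) / sample_w.sum()
        w -= lr * grad
    return w

def reweight(w_global, X, y):
    # Illustrative rule (an assumption): up-weight samples that the current
    # global model gets wrong, normalized so weights average to 1.
    err = np.abs(predict(w_global, X) - y)
    sw = np.exp(err)
    return sw / sw.mean()

w_global = np.zeros(2)
for rnd in range(20):
    updates, sizes = [], []
    for X, y in clients:
        sw = reweight(w_global, X, y)         # adjust local sample weights each round
        updates.append(local_update(w_global, X, y, sw))
        sizes.append(len(y))
    # FedAvg aggregation: average client models weighted by data volume.
    w_global = np.average(updates, axis=0, weights=sizes)

acc = np.mean([((predict(w_global, X) > 0.5) == y).mean() for X, y in clients])
print(f"round-20 mean client accuracy: {acc:.3f}")

In this toy setting, the per-round reweighting keeps the smaller, distribution-shifted client from being washed out by the volume-weighted aggregation; whether and how Fed-Tra realizes this effect is detailed in the paper, not in this sketch.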
Acknowledgments
We thank the anonymous reviewers for their help in improving our paper. This work was supported by Grant 2020YFB1005402 from the National Key R&D Program of China.
Copyright information
© 2022 Springer Nature Switzerland AG
About this paper
Cite this paper
Xiao, W. et al. (2022). Fed-Tra: Improving Accuracy of Deep Learning Model on Non-iid in Federated Learning. In: Lai, Y., Wang, T., Jiang, M., Xu, G., Liang, W., Castiglione, A. (eds) Algorithms and Architectures for Parallel Processing. ICA3PP 2021. Lecture Notes in Computer Science, vol 13155. Springer, Cham. https://doi.org/10.1007/978-3-030-95384-3_49