DOI: 10.1145/3597062.3597280
Open access

The effect of network topologies on fully decentralized learning: a preliminary investigation

Published: 21 June 2023

Abstract

    In a decentralized machine learning system, data is typically partitioned among multiple devices or nodes, each of which trains a local model using its own data. These local models are then shared and combined to create a global model that can make accurate predictions on new data. In this paper, we begin exploring how the network topology connecting nodes affects the performance of a machine learning model trained through direct collaboration between nodes. We investigate how different types of topologies impact the "spreading of knowledge", i.e., the ability of nodes to incorporate into their local models the knowledge derived from learning patterns in data available at other nodes across the network. Specifically, we highlight the different roles played in this process by more and less connected nodes (hubs and leaves), as well as by macroscopic network properties (primarily, degree distribution and modularity). Among other findings, we show that, while even weak connectivity among network components is known to be sufficient for information spread, it may not be sufficient for knowledge spread. More intuitively, we also find that hubs play a more significant role than leaves in spreading knowledge, and that this holds not only for heavy-tailed degree distributions but also when "hubs" have only moderately more connections than leaves. Finally, we show that tightly knit communities severely hinder knowledge spread.
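    The topology effect described above can be illustrated with a toy sketch. This is not the paper's experimental setup: scalar "models" stand in for full parameter vectors, the averaging rule is a generic gossip step (in the spirit of decentralized federated averaging), and the two graphs and round counts are illustrative assumptions only.

    ```python
    def decentralized_average(adj, values, rounds):
        """Each round, every node replaces its value with the mean of its
        own value and its neighbors' values (a simple gossip/averaging step)."""
        v = list(values)
        for _ in range(rounds):
            v = [
                (v[i] + sum(v[j] for j in neigh)) / (1 + len(neigh))
                for i, neigh in enumerate(adj)
            ]
        return v

    def spread(v):
        # Disagreement across the network: 0 means full consensus.
        return max(v) - min(v)

    n = 8
    # Topology A: a ring (sparse, but no community structure).
    ring = [[(i - 1) % n, (i + 1) % n] for i in range(n)]

    # Topology B: two 4-node cliques joined by a single bridge edge
    # (two tightly knit communities, i.e., high modularity).
    bridged = [[j for j in range(4) if j != i] for i in range(4)] + \
              [[j for j in range(4, 8) if j != i] for i in range(4, 8)]
    bridged[3].append(4)
    bridged[4].append(3)

    # "Knowledge" initially concentrated in one half of the network.
    init = [1.0] * 4 + [0.0] * 4

    for name, adj in [("ring", ring), ("two cliques + bridge", bridged)]:
        print(name, [round(spread(decentralized_average(adj, init, r)), 3)
                     for r in (0, 5, 20)])
    ```

    Tracking the residual disagreement over rounds shows both graphs converging toward consensus, with the single bridge edge acting as the bottleneck through which all cross-community "knowledge" must pass.
    
    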


    Cited By

    • (2024) Impact of network topology on the performance of Decentralized Federated Learning. Computer Networks, 110681. DOI: 10.1016/j.comnet.2024.110681. Online publication date: Aug-2024.
    • (2023) Exploring the Impact of Disrupted Peer-to-Peer Communications on Fully Decentralized Learning in Disaster Scenarios. 2023 International Conference on Information and Communication Technologies for Disaster Management (ICT-DM), pages 1-6. DOI: 10.1109/ICT-DM58371.2023.10286953. Online publication date: 13-Sep-2023.

    Published In

    NetAISys '23: Proceedings of the 1st International Workshop on Networked AI Systems
    June 2023
    43 pages
    ISBN:9798400702129
    DOI:10.1145/3597062
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. decentralized learning
    2. graph topologies
    3. non-IID data

    Qualifiers

    • Research-article

    Conference

    NetAISys '23


    Article Metrics

    • Downloads (Last 12 months)109
    • Downloads (Last 6 weeks)21
    Reflects downloads up to 10 Aug 2024

