Developing lightweight training approaches for distributed machine learning is key to deploying machine learning on Internet-of-Things (IoT) devices. Among distributed machine learning methods, Gossip SGD is attractive thanks to its applicability to various network topologies (with or without a central entity) and its simple parameter-update strategy. However, it is known to be inefficient under non-homogeneous data distributions. Our work addresses this issue by introducing "parameter swapping" into the Gossip SGD parameter update, letting devices simply exchange their parameters. Our method, named Gossip Swap SGD, efficiently resolves a cause of insufficient accuracy and delayed convergence in parameter updates under non-homogeneous data distributions. Our quantitative evaluation demonstrates that our method not only outperforms existing methods on non-homogeneous data distributions but also shows no degradation on homogeneous data distributions.
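The abstract does not specify the exact update rule, but the core idea can be illustrated with a minimal sketch: in a gossip round, plain Gossip SGD averages the parameters of paired devices, whereas the swap variant exchanges them, so each model keeps training on a different device's local data. All names, the toy quadratic objective, and the random pairing scheme below are illustrative assumptions, not the paper's implementation.

import numpy as np

rng = np.random.default_rng(0)

# Toy setup: each of N devices fits a model w on its own
# (non-homogeneous) local data, mimicked by distinct local optima.
N, DIM, STEPS, LR = 4, 5, 200, 0.05
local_targets = [rng.normal(size=DIM) for _ in range(N)]
params = [rng.normal(size=DIM) for _ in range(N)]

def local_gradient(w, target):
    # Gradient of the quadratic loss ||w - target||^2 / 2.
    return w - target

for step in range(STEPS):
    # 1) Local SGD step on each device.
    for i in range(N):
        params[i] = params[i] - LR * local_gradient(params[i], local_targets[i])

    # 2) Gossip round: pair devices at random.
    order = rng.permutation(N)
    for a, b in zip(order[::2], order[1::2]):
        # Plain Gossip SGD would average the pair:
        #   avg = (params[a] + params[b]) / 2
        #   params[a] = params[b] = avg.copy()
        # Parameter swapping instead exchanges the parameters, so each
        # model is subsequently trained on the other device's data.
        params[a], params[b] = params[b], params[a]

# Intuition: with swapping, every model eventually sees all local
# distributions, which is the claimed source of robustness to
# non-homogeneous data.
consensus = np.mean(params, axis=0)
print("spread across models:", np.std(params, axis=0).mean())
print("consensus error vs. mean target:",
      np.linalg.norm(consensus - np.mean(local_targets, axis=0)))

Note that swapping, unlike averaging, preserves each parameter vector intact; mixing across devices happens over repeated rounds as models migrate through the network.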