TravellingFL: communication efficient peer-to-peer federated learning


Bibliographic Details
Main Authors: Gupta, Vansh, Luqman, Alka, Chattopadhyay, Nandish, Chattopadhyay, Anupam, Niyato, Dusit
Other Authors: School of Computer Science and Engineering
Format: Article
Language:English
Published: 2024
Subjects:
Online Access:https://hdl.handle.net/10356/173391
Institution: Nanyang Technological University
Description
Summary: Peer-to-Peer federated learning is a distributed machine learning paradigm whose primary goal is to learn a well-performing global model by collaboratively training a shared model at different data hubs without the need to share data. Due to its immense practical applications, there is growing attention towards various challenges of efficient federated learning, including communication efficiency, assumptions on connectivity, data heterogeneity, and enhanced privacy. In this paper, we address the problem of dynamic network topologies in federated learning. We present a technique that helps new participants in Peer-to-Peer federated learning reach the best possible accuracy by leveraging learning at other devices in a communication-efficient manner. We model the costs in federated learning and apply a graph-theoretical framework to show that one can draw from a range of graph-based algorithms to construct an efficient communication algorithm on a connected network, thereby matching the inference efficiency of centralized federated learning. We conduct experiments with varied graph formations and sizes to validate our claims.
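To illustrate the graph-based flavor of the approach described in the abstract, the sketch below selects a minimum-cost communication tree over a connected peer network using Kruskal's algorithm. This is a hypothetical illustration only: the specific cost model, peer graph, and choice of spanning-tree algorithm are assumptions for the example, not the paper's actual method.

```python
# Hypothetical sketch: picking a minimum-cost set of links over a connected
# peer-to-peer network, in the spirit of the graph-theoretical framework the
# abstract describes. Edge weights stand in for pairwise communication costs.

def minimum_cost_tree(num_peers, edges):
    """Kruskal's algorithm: return the edges of a minimum spanning tree.

    edges: list of (cost, u, v) tuples over peers 0 .. num_peers-1,
    assumed to describe a connected network.
    """
    parent = list(range(num_peers))  # union-find forest

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    tree = []
    for cost, u, v in sorted(edges):   # consider cheapest links first
        ru, rv = find(u), find(v)
        if ru != rv:                   # edge joins two components: keep it
            parent[ru] = rv
            tree.append((cost, u, v))
    return tree

# Four peers; peer 3 is a newly joined participant. Costs are illustrative.
edges = [(4, 0, 1), (1, 0, 2), (3, 1, 2), (2, 2, 3), (5, 1, 3)]
tree = minimum_cost_tree(4, edges)
total_cost = sum(c for c, _, _ in tree)
```

A spanning tree touches every peer with the fewest possible links, so routing model updates along it keeps every participant reachable while minimizing total communication cost under this toy cost model.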