Coded federated learning for communication-efficient edge computing: a survey
In the era of artificial intelligence and big data, the demand for data processing has surged, leading to ever-larger datasets and greater demands on computation capability. Distributed machine learning (DML) has been introduced to address this challenge by distributing tasks among multiple workers, reducing the resources required of each worker. However, in distributed systems, slow machines, commonly known as stragglers, or failed links can lead to prolonged runtimes and diminished performance. This survey explores the application of coding techniques in DML and in coded edge computing to enhance system speed, robustness, privacy, and more. Notably, the study delves into coding in Federated Learning (FL), a specialized distributed learning system. Coding introduces redundancy into the system and exploits multicast opportunities, yielding a tradeoff between computation and communication costs. The survey establishes that coding is a promising approach for building robust, secure, and low-latency distributed systems.
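The coded-computing idea the abstract refers to (adding redundancy so that stragglers can be ignored) can be sketched with a toy (n, k) MDS-style coded matrix-vector multiplication. This is an illustrative sketch, not code from the surveyed paper: the block sizes, worker count, and Vandermonde encoding matrix are assumptions chosen so that the results from any k of the n workers suffice to recover the full product.

```python
import numpy as np

# Toy (n, k) coded distributed computation of A @ x: split A into k
# row-blocks, encode them into n coded blocks with a Vandermonde matrix,
# so ANY k of the n worker results suffice. The slowest n - k workers
# (stragglers) can simply be ignored.
rng = np.random.default_rng(0)
k, n = 3, 5                      # k data blocks, n workers; tolerates n - k stragglers
A = rng.standard_normal((6, 4))  # rows of A split evenly into k blocks
x = rng.standard_normal(4)

blocks = np.split(A, k)                                  # k row-blocks of A
G = np.vander(np.arange(1, n + 1), k, increasing=True)   # n x k encoding matrix
coded = [sum(G[i, j] * blocks[j] for j in range(k)) for i in range(n)]

# Each worker multiplies its coded block by x; suppose workers 1 and 3 straggle.
results = {i: coded[i] @ x for i in range(n) if i not in (1, 3)}

# Decode from any k responses by inverting the corresponding k rows of G.
ids = sorted(results)[:k]
decoded = np.linalg.solve(G[ids], np.stack([results[i] for i in ids]))
recovered = decoded.reshape(-1)

assert np.allclose(recovered, A @ x)  # full product recovered without stragglers
```

The redundancy (n coded tasks for k blocks of work) is exactly the computation-versus-robustness tradeoff the abstract mentions: each worker does extra encoded work so the system never waits on the slowest machines.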
Main Authors: Zhang, Yiqian; Gao, Tianli; Li, Congduan; Tan, Chee Wei
Other Authors: School of Electrical and Electronic Engineering; School of Computer Science and Engineering
Format: Article
Language: English
Published: 2024
Subjects: Engineering; Distributed machine learning; Distributed computing
Online Access: https://hdl.handle.net/10356/181266
Institution: Nanyang Technological University
id |
sg-ntu-dr.10356-181266 |
record_format |
dspace |
spelling |
sg-ntu-dr.10356-181266 2024-11-22T15:41:23Z
Coded federated learning for communication-efficient edge computing: a survey
Zhang, Yiqian; Gao, Tianli; Li, Congduan; Tan, Chee Wei
School of Electrical and Electronic Engineering; School of Computer Science and Engineering
Subjects: Engineering; Distributed machine learning; Distributed computing
In the era of artificial intelligence and big data, the demand for data processing has surged, leading to ever-larger datasets and greater demands on computation capability. Distributed machine learning (DML) has been introduced to address this challenge by distributing tasks among multiple workers, reducing the resources required of each worker. However, in distributed systems, slow machines, commonly known as stragglers, or failed links can lead to prolonged runtimes and diminished performance. This survey explores the application of coding techniques in DML and in coded edge computing to enhance system speed, robustness, privacy, and more. Notably, the study delves into coding in Federated Learning (FL), a specialized distributed learning system. Coding introduces redundancy into the system and exploits multicast opportunities, yielding a tradeoff between computation and communication costs. The survey establishes that coding is a promising approach for building robust, secure, and low-latency distributed systems.
Funding: Ministry of Education (MOE). Published version. This work was supported in part by the National Science Foundation of China (NSFC) under Grant 62271514; in part by the Science, Technology and Innovation Commission of Shenzhen Municipality under Grant JCYJ20210324120002007 and Grant ZDSYS20210623091807023; in part by the Research Fund of State Key Laboratory of Public Big Data, Guizhou University, under Grant PBD2023-01; and in part by the Ministry of Education, Singapore, under its Academic Research Fund Grant AcRF RG91/22 and NTU Startup. (Yiqian Zhang and Tianli Gao contributed equally to this work.)
Deposited: 2024-11-20T05:01:05Z. Published: 2024. Journal Article.
Citation: Zhang, Y., Gao, T., Li, C. & Tan, C. W. (2024). Coded federated learning for communication-efficient edge computing: a survey. IEEE Open Journal of the Communications Society, 5, 4098-4124. https://dx.doi.org/10.1109/OJCOMS.2024.3423362
ISSN: 2644-125X. Handle: https://hdl.handle.net/10356/181266. DOI: 10.1109/OJCOMS.2024.3423362. Scopus: 2-s2.0-85197505598. Grant: RG91/22.
Journal: IEEE Open Journal of the Communications Society, vol. 5, pp. 4098-4124. Language: English.
© 2024 The Authors. This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License. application/pdf |
institution |
Nanyang Technological University |
building |
NTU Library |
continent |
Asia |
country |
Singapore |
content_provider |
NTU Library |
collection |
DR-NTU |
language |
English |
topic |
Engineering Distributed machine learning Distributed computing |
description |
In the era of artificial intelligence and big data, the demand for data processing has surged, leading to ever-larger datasets and greater demands on computation capability. Distributed machine learning (DML) has been introduced to address this challenge by distributing tasks among multiple workers, reducing the resources required of each worker. However, in distributed systems, slow machines, commonly known as stragglers, or failed links can lead to prolonged runtimes and diminished performance. This survey explores the application of coding techniques in DML and in coded edge computing to enhance system speed, robustness, privacy, and more. Notably, the study delves into coding in Federated Learning (FL), a specialized distributed learning system. Coding introduces redundancy into the system and exploits multicast opportunities, yielding a tradeoff between computation and communication costs. The survey establishes that coding is a promising approach for building robust, secure, and low-latency distributed systems. |
author2 |
School of Electrical and Electronic Engineering |
format |
Article |
author |
Zhang, Yiqian Gao, Tianli Li, Congduan Tan, Chee Wei |
author_sort |
Zhang, Yiqian |
title |
Coded federated learning for communication-efficient edge computing: a survey |
publishDate |
2024 |
url |
https://hdl.handle.net/10356/181266 |
_version_ |
1816859064071618560 |