SoteriaFL: A unified framework for private federated learning with communication compression
To enable large-scale machine learning in bandwidth-hungry environments such as wireless networks, significant progress has been made recently in designing communication-efficient federated learning algorithms with the aid of communication compression. On the other end, privacy-preserving, especiall...
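The abstract describes combining two ingredients: compressing client updates to save bandwidth, and privatizing them at the client level. A minimal sketch of these two generic building blocks is below — top-k sparsification as the compressor and the Gaussian mechanism for privacy. Both operators, the function names, and all parameter values are illustrative assumptions, not the specific construction used in SoteriaFL.

```python
import math
import random

def top_k_compress(grad, k):
    # Generic top-k sparsification: keep the k largest-magnitude coordinates
    # and zero the rest. A common communication compressor in federated
    # learning (illustrative; not necessarily the one analyzed in the paper).
    idx = sorted(range(len(grad)), key=lambda i: abs(grad[i]))[-k:]
    keep = set(idx)
    return [g if i in keep else 0.0 for i, g in enumerate(grad)]

def privatize(grad, clip_norm, noise_std, rng):
    # Clip the update to bound its L2 sensitivity, then add Gaussian noise
    # (the standard Gaussian mechanism for client-level differential privacy).
    norm = math.sqrt(sum(g * g for g in grad)) or 1e-12
    scale = min(1.0, clip_norm / norm)
    return [g * scale + rng.gauss(0.0, noise_std) for g in grad]

# A client would privatize its local update first, then compress the
# noisy result before sending it to the server.
rng = random.Random(0)
g = [rng.gauss(0.0, 1.0) for _ in range(10)]
msg = top_k_compress(privatize(g, clip_norm=1.0, noise_std=0.1, rng=rng), k=3)
```

The order of operations matters in practice: adding noise before compression means the transmitted message carries both the privacy guarantee and the reduced communication cost, at the price of compressing an already-noisy signal.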
Main Authors: LI, Zhize; ZHAO, Haoyu; LI, Boyue; CHI, Yuejie
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2022
Online Access: https://ink.library.smu.edu.sg/sis_research/8688
https://ink.library.smu.edu.sg/context/sis_research/article/9691/viewcontent/NeurIPS22_full_soteriafl.pdf
Institution: Singapore Management University
Similar Items
- BEER: Fast O(1/T) rate for decentralized nonconvex optimization with communication compression
  by: ZHAO, Haoyu, et al. Published: (2022)
- Tutorial: "Advances in federated optimization: Efficiency, resiliency, and privacy"
  by: CHI, Yuejie, et al. Published: (2023)
- DESTRESS: Computation-optimal and communication-efficient decentralized nonconvex finite-sum optimization
  by: LI, Boyue, et al. Published: (2022)
- Escaping saddle points in heterogeneous federated learning via distributed SGD with communication compression
  by: CHEN, Sijin, et al. Published: (2024)
- Faster rates for compressed federated learning with client-variance reduction
  by: ZHAO, Haoyu, et al. Published: (2024)