BEER: Fast O(1/T) rate for decentralized nonconvex optimization with communication compression
Communication efficiency has been widely recognized as the bottleneck for large-scale decentralized machine learning applications in multi-agent or federated environments. To tackle the communication bottleneck, there have been many efforts to design communication-compressed algorithms for decentral...
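Communication-compressed algorithms of the kind described above typically apply a contractive compression operator to the messages agents exchange. As a minimal illustrative sketch (top-k sparsification is a standard example of such a compressor; the specific operator used in BEER may differ), the idea can be shown as:

```python
import numpy as np

def top_k(x, k):
    """Top-k sparsification: keep the k largest-magnitude entries, zero the rest.

    A standard example of a contractive compressor C satisfying
    E||C(x) - x||^2 <= (1 - k/d) * ||x||^2 for x in R^d.
    Illustrative only; not necessarily the compressor used in BEER.
    """
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]  # indices of the k largest-magnitude entries
    out[idx] = x[idx]
    return out

# Transmitting only 2 of 5 coordinates cuts communication per round,
# at the cost of a bounded compression error.
x = np.array([0.1, -3.0, 0.5, 2.0, -0.2])
print(top_k(x, 2))
```

Only the surviving index-value pairs need to be sent over the network, which is where the communication savings come from.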
Main Authors: ZHAO, Haoyu; LI, Boyue; LI, Zhize; RICHTARIK, Peter; CHI, Yuejie
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2022
Online Access: https://ink.library.smu.edu.sg/sis_research/8687
https://ink.library.smu.edu.sg/context/sis_research/article/9690/viewcontent/NeurIPS22_full_beer.pdf
Institution: Singapore Management University
Similar Items
- DESTRESS: Computation-optimal and communication-efficient decentralized nonconvex finite-sum optimization, by: LI, Boyue, et al. Published: (2022)
- SoteriaFL: A unified framework for private federated learning with communication compression, by: LI, Zhize, et al. Published: (2022)
- CANITA: Faster rates for distributed convex optimization with communication compression, by: LI, Zhize, et al. Published: (2021)
- PAGE: A simple and optimal probabilistic gradient estimator for nonconvex optimization, by: LI, Zhize, et al. Published: (2021)
- Faster rates for compressed federated learning with client-variance reduction, by: ZHAO, Haoyu, et al. Published: (2024)