SoteriaFL: A unified framework for private federated learning with communication compression

To enable large-scale machine learning in bandwidth-hungry environments such as wireless networks, significant progress has recently been made in designing communication-efficient federated learning algorithms with the aid of communication compression. On the other hand, privacy preservation, especially at the client level, is another important desideratum that has not yet been addressed simultaneously in the presence of advanced communication compression techniques. In this paper, we propose a unified framework that enhances the communication efficiency of private federated learning with communication compression. Exploiting both general compression operators and local differential privacy, we first examine a simple algorithm that applies compression directly to differentially private stochastic gradient descent, and identify its limitations. We then propose a unified framework, SoteriaFL, for private federated learning, which accommodates a general family of local gradient estimators, including popular stochastic variance-reduced gradient methods and the state-of-the-art shifted compression scheme. We provide a comprehensive characterization of its performance trade-offs in terms of privacy, utility, and communication complexity, where SoteriaFL is shown to achieve better communication complexity than other private federated learning algorithms without communication compression, while sacrificing neither privacy nor utility.
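For orientation, below is a minimal Python sketch contrasting the two client-side update rules the abstract describes: the direct combination of compression with differentially private SGD, and SoteriaFL's shifted compression. It assumes a rand-k sparsifier and Gaussian-mechanism noise; the function names, the shift state, and the step size alpha are illustrative assumptions, not the paper's notation or its exact algorithm.

```python
import numpy as np

def rand_k_compress(v, k):
    """Unbiased rand-k sparsifier: keep k random coordinates, rescale by d/k."""
    d = v.size
    idx = np.random.choice(d, size=k, replace=False)
    out = np.zeros_like(v)
    out[idx] = v[idx] * (d / k)  # rescaling keeps E[out] = v
    return out

def compressed_dpsgd_step(grad, sigma, k):
    """Baseline sketch: privatize the raw gradient (Gaussian mechanism),
    then compress it. The compressor always acts on a full-magnitude
    vector, so its error does not shrink as training converges -- the
    limitation the abstract attributes to this direct combination."""
    private = grad + np.random.normal(0.0, sigma, size=grad.shape)
    return rand_k_compress(private, k)

def shifted_compression_step(grad, sigma, k, state, alpha=0.5):
    """SoteriaFL-style sketch: compress the difference between the private
    gradient and a locally maintained shift, then move the shift toward
    what was sent (the server mirrors the same shift update). As this
    difference shrinks over training, so does the compression error."""
    private = grad + np.random.normal(0.0, sigma, size=grad.shape)
    delta = rand_k_compress(private - state["shift"], k)  # compressed message
    state["shift"] = state["shift"] + alpha * delta       # shift update
    return delta
```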

Bibliographic Details
Main Authors: LI, Zhize; ZHAO, Haoyu; LI, Boyue; CHI, Yuejie
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2022
Subjects: Databases and Information Systems
Collection: Research Collection School Of Computing and Information Systems
License: http://creativecommons.org/licenses/by-nc-nd/4.0/
Online Access: https://ink.library.smu.edu.sg/sis_research/8688
https://ink.library.smu.edu.sg/context/sis_research/article/9691/viewcontent/NeurIPS22_full_soteriafl.pdf
Institution: Singapore Management University