DESTRESS: Computation-optimal and communication-efficient decentralized nonconvex finite-sum optimization

Emerging applications in multiagent environments, such as the internet-of-things, networked sensing, autonomous systems, and federated learning, call for decentralized algorithms for finite-sum optimization that are resource efficient in terms of both computation and communication. In this paper, we consider the prototypical setting where the agents work collaboratively to minimize the sum of local loss functions by communicating only with their neighbors over a predetermined network topology. We develop a new algorithm, called DEcentralized STochastic REcurSive gradient methodS (DESTRESS), for nonconvex finite-sum optimization, which matches the optimal incremental first-order oracle complexity of centralized algorithms for finding first-order stationary points, while maintaining communication efficiency. Detailed theoretical and numerical comparisons corroborate that the resource efficiencies of DESTRESS improve upon prior decentralized algorithms over a wide range of parameter regimes. DESTRESS leverages several key algorithm design ideas, including stochastic recursive gradient updates with minibatches for local computation and gradient tracking with extra mixing (i.e., multiple gossiping rounds) for per-iteration communication, together with careful choices of hyperparameters and new analysis frameworks, to provably achieve a desirable computation-communication trade-off.
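
To make the setting concrete: the "sum of local loss functions" in the abstract is the usual decentralized finite-sum objective, in which each of n agents holds m local component functions. The display below uses standard notation for this problem class and is a sketch of the setup, not an excerpt from the paper:

    \min_{x \in \mathbb{R}^d} \; f(x) := \frac{1}{n} \sum_{i=1}^{n} f_i(x),
    \qquad
    f_i(x) := \frac{1}{m} \sum_{j=1}^{m} f_{i,j}(x),

where agent i can evaluate gradients only of its own components f_{i,j} and can exchange vectors only with its neighbors in the communication graph.

One design idea named in the abstract, gradient tracking with extra mixing, replaces the single gossip round of standard gradient-tracking methods with several consecutive rounds per communication phase. The sketch below illustrates only that mixing step; the function name extra_mixing, the ring topology, and the round count K are illustrative assumptions, not the paper's implementation:

    # Hedged sketch (not the authors' code): K gossip rounds applied to the
    # agents' stacked variables, e.g., gradient-tracking vectors.
    import numpy as np

    def extra_mixing(V, W, K):
        """Apply K rounds of gossip averaging to the rows of V.

        V: (n, d) array, one row per agent.
        W: (n, n) doubly stochastic mixing matrix of the network.
        K: number of gossip rounds per communication phase.
        """
        for _ in range(K):
            V = W @ V  # each agent averages with its neighbors
        return V

    # Toy usage on a 4-agent ring: rows contract toward their common average.
    n, d, K = 4, 3, 5
    P = np.roll(np.eye(n), 1, axis=0)       # cyclic-shift permutation (ring)
    W = 0.5 * np.eye(n) + 0.25 * (P + P.T)  # doubly stochastic ring mixing
    V = np.random.randn(n, d)
    V_mixed = extra_mixing(V, W, K)
    print(np.abs(V_mixed - V.mean(axis=0)).max())  # small after K rounds

Because W is doubly stochastic, every round preserves the network-wide average while shrinking disagreement geometrically at the rate of W's second-largest eigenvalue modulus, which is why a few extra gossip rounds per iteration buy markedly better consensus at modest communication cost.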

Bibliographic Details
Main Authors: LI, Boyue; LI, Zhize; CHI, Yuejie
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2022
DOI: 10.1137/21M1450677
License: CC BY-NC-ND 4.0 (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Subjects: decentralized optimization; nonconvex finite-sum optimization; stochastic recursive gradient methods; Databases and Information Systems
Collection: Research Collection School Of Computing and Information Systems (InK@SMU)
Online Access: https://ink.library.smu.edu.sg/sis_research/8691
https://ink.library.smu.edu.sg/context/sis_research/article/9694/viewcontent/SIMODS22_destress.pdf