Innovation compression for communication-efficient distributed optimization with linear convergence
Information compression is essential to reduce communication cost in distributed optimization over peer-to-peer networks. This paper proposes a communication-efficient linearly convergent distributed (COLD) algorithm to solve strongly convex optimization problems. By compressing innovation vectors, which are the differences between decision vectors and their estimates, COLD achieves linear convergence for a class of δ-contracted compressors, and we explicitly quantify how the compression affects the convergence rate. Interestingly, our results strictly improve existing results for the quantized consensus problem. Numerical experiments demonstrate the advantages of COLD under different compressors.
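As an illustration of the δ-contracted compressors mentioned in the abstract (not code from the paper, whose algorithm is not reproduced here), a top-k sparsifier is a standard example: keeping the k largest-magnitude entries of a d-dimensional vector x guarantees ‖C(x) − x‖² ≤ (1 − k/d)‖x‖², i.e., δ = k/d. A minimal sketch, with all names hypothetical:

```python
import numpy as np

def topk_compressor(x, k):
    """Keep the k largest-magnitude entries of x; zero the rest.

    A delta-contracted compressor with delta = k / x.size:
    ||C(x) - x||^2 <= (1 - k/d) * ||x||^2 for every x.
    """
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]  # indices of the k largest |x_i|
    out[idx] = x[idx]
    return out

# Compressing the "innovation": the difference between the current
# decision vector and a neighbor's estimate of it.
decision = np.array([0.9, -0.1, 2.5, 0.05, -1.7])
estimate = np.array([1.0, 0.0, 2.0, 0.0, -2.0])
innovation = decision - estimate
compressed = topk_compressor(innovation, k=2)
```

Transmitting only the compressed innovation (k nonzero entries plus their indices), rather than the full decision vector, is what reduces the per-iteration communication cost.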
Main Authors: Zhang, Jiaqi; You, Keyou; Xie, Lihua
Other Authors: School of Electrical and Electronic Engineering
Format: Article
Language: English
Published: 2023
Subjects: Engineering::Electrical and electronic engineering; Distributed Optimization; Linear Convergence
Online Access: https://hdl.handle.net/10356/170700
Institution: Nanyang Technological University
id: sg-ntu-dr.10356-170700
record_format: dspace
Authors: Zhang, Jiaqi; You, Keyou; Xie, Lihua (School of Electrical and Electronic Engineering)
Citation: Zhang, J., You, K. & Xie, L. (2023). Innovation compression for communication-efficient distributed optimization with linear convergence. IEEE Transactions on Automatic Control, 1-8. https://dx.doi.org/10.1109/TAC.2023.3241771
Journal: IEEE Transactions on Automatic Control
ISSN: 0018-9286
DOI: 10.1109/TAC.2023.3241771
Scopus ID: 2-s2.0-85148447851
Funding: This work was supported by the Technology and Innovation Major Project of the Ministry of Science and Technology of China under Grant 2020AAA0108403, the National Natural Science Foundation of China under Grant No. 62033006, and the Tsinghua University Initiative Scientific Research Program.
Accessioned: 2023-09-26
Rights: © 2023 IEEE. All rights reserved.
building: NTU Library
continent: Asia
country: Singapore
content_provider: NTU Library
collection: DR-NTU
topic: Engineering::Electrical and electronic engineering; Distributed Optimization; Linear Convergence
description: Information compression is essential to reduce communication cost in distributed optimization over peer-to-peer networks. This paper proposes a communication-efficient linearly convergent distributed (COLD) algorithm to solve strongly convex optimization problems. By compressing innovation vectors, which are the differences between decision vectors and their estimates, COLD achieves linear convergence for a class of δ-contracted compressors, and we explicitly quantify how the compression affects the convergence rate. Interestingly, our results strictly improve existing results for the quantized consensus problem. Numerical experiments demonstrate the advantages of COLD under different compressors.