Byzantine-resilient decentralized stochastic gradient descent

Decentralized learning has gained great popularity as a way to improve learning efficiency and preserve data privacy. Each computing node makes an equal contribution to collaboratively learning a Deep Learning model. Eliminating the centralized Parameter Server (PS) can effectively address many issues, such as privacy, performance bottlenecks and single points of failure. However, how to achieve Byzantine Fault Tolerance in decentralized learning systems has rarely been explored, although this problem has been extensively studied in centralized systems. In this paper, we present an in-depth study of the Byzantine resilience of decentralized learning systems, with two contributions. First, from the adversarial perspective, we theoretically show that Byzantine attacks are more dangerous and feasible in decentralized learning systems: even one malicious participant can arbitrarily alter the models of other participants by sending carefully crafted updates to its neighbors. Second, from the defense perspective, we propose UBAR, a novel algorithm that enhances decentralized learning with Byzantine Fault Tolerance. Specifically, UBAR provides a Uniform Byzantine-resilient Aggregation Rule that lets benign nodes select useful parameter updates and filter out malicious ones in each training iteration. It guarantees that each benign node in a decentralized system can train a correct model under very strong Byzantine attacks with an arbitrary number of faulty nodes. We conduct extensive experiments on standard image classification tasks, and the results indicate that UBAR can effectively defeat both simple and sophisticated Byzantine attacks with higher performance efficiency than existing solutions.
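To make the adversarial claim concrete, here is a minimal numerical sketch (our own illustration, not code from the paper) of why a single Byzantine participant suffices under plain decentralized averaging: an attacker that knows which parameters its neighbors will contribute can craft one update that forces their average onto any target vector it chooses.

```python
import numpy as np

# Hypothetical setup: three benign nodes plus one Byzantine neighbor, all
# aggregating by plain (unweighted) averaging of the parameters they receive.
rng = np.random.default_rng(0)
dim = 4
benign = [rng.normal(size=dim) for _ in range(3)]  # benign parameter vectors
target = np.full(dim, 100.0)                       # attacker's arbitrary goal

# The attacker crafts its "update" so that the average of all four
# contributions lands exactly on the target:
#   (sum(benign) + crafted) / 4 == target
crafted = 4 * target - np.sum(benign, axis=0)

# One aggregation round: every benign node that averages what it received
# now holds exactly the attacker's chosen parameters.
aggregated = (np.sum(benign, axis=0) + crafted) / 4
print(np.allclose(aggregated, target))  # True
```

A single round is enough here because plain averaging places no bound on the influence of any one contribution; robust aggregation rules exist precisely to limit that influence.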

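The abstract does not spell out UBAR's internals, but its description (select useful parameter updates, filter out malicious ones in each iteration) suggests a two-stage rule. The sketch below is a hedged reconstruction under that reading: a distance-based shortlist followed by a loss-based performance check. The function name, the `rho` parameter, and the fallback rule are all our assumptions for illustration, not the paper's specification.

```python
import numpy as np

def ubar_style_aggregate(own, own_loss, received, loss_fn, rho=0.5):
    """Two-stage Byzantine-resilient aggregation (hedged sketch, see lead-in).

    Stage 1 keeps the rho fraction of received parameter vectors closest to
    this node's own parameters; Stage 2 keeps only shortlisted vectors whose
    loss on a local sample batch does not exceed the node's own loss, then
    averages the survivors together with the node's own parameters.
    """
    # Stage 1: distance-based shortlist of the closest candidates.
    dists = [np.linalg.norm(p - own) for p in received]
    k = max(1, int(rho * len(received)))
    shortlist = [received[i] for i in np.argsort(dists)[:k]]

    # Stage 2: performance-based filtering on a local sample batch.
    survivors = [p for p in shortlist if loss_fn(p) <= own_loss]
    if not survivors:  # assumed fallback: keep the best-performing candidate
        survivors = [min(shortlist, key=loss_fn)]
    return np.mean([own] + survivors, axis=0)

# Toy usage with a quadratic loss whose optimum is the origin (hypothetical
# data): the far-away "Byzantine" vector is discarded in Stage 1.
loss_fn = lambda p: float(np.sum(p ** 2))
own = np.array([0.5, -0.2])
received = [np.array([0.4, -0.1]), np.array([9.0, 9.0])]  # second is malicious
print(ubar_style_aggregate(own, loss_fn(own), received, loss_fn))
```

The two stages are complementary: distance filtering alone can be fooled by updates that stay numerically close while degrading the model, and loss filtering alone is expensive to run on every received update, so screening by distance first keeps the per-iteration cost low.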

Bibliographic Details
Main Authors: GUO, Shangwei; ZHANG, Tianwei; YU, Han; XIE, Xiaofei; MA, Lei; XIANG, Tao; LIU, Yang
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2022
Publish Date: 2022-06-01
DOI: 10.1109/TCSVT.2021.3116976
Collection: Research Collection School Of Computing and Information Systems
Subjects: Training; servers; learning systems; distance learning; computer aided instruction; security; fault tolerant systems; decentralized learning; stochastic gradient descent; Byzantine attack; Byzantine fault tolerance; Artificial Intelligence and Robotics; Theory and Algorithms
Online Access: https://ink.library.smu.edu.sg/sis_research/7827
Institution: Singapore Management University