Tutorial: "Advances in federated optimization: Efficiency, resiliency, and privacy"


Bibliographic Details
Main Authors: CHI, Yuejie, LI, Zhize
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2023
Subjects: Databases and Information Systems
Online Access: https://ink.library.smu.edu.sg/sis_research/8717
Institution: Singapore Management University
id sg-smu-ink.sis_research-9720
record_format dspace
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic Databases and Information Systems
description Rationale for the tutorial: The proliferation of multi-agent environments in emerging applications such as the internet of things (IoT), networked sensing, and autonomous systems has led to a flurry of activity in developing federated and decentralized optimization algorithms for training predictive models, particularly under the realm of federated learning (FL). The distinctive features of these large-scale distributed systems pose unique challenges that are not well captured by the classical distributed optimization framework, and have therefore spurred a significant amount of recent algorithmic development focusing on the following aspects:
Resource efficiency: The ever-growing scale and dimensionality of datasets necessitate algorithms that perform well in a resource-efficient manner in terms of both communication and computation.
Resiliency to heterogeneity: Data samples collected from different agents can be highly unbalanced and heterogeneous, in which case vanilla federated optimization algorithms (e.g., FedAvg) can converge very slowly or even diverge; better algorithm designs are called for to handle the heterogeneity issue.
Privacy preservation: While FL holds great promise of harnessing the inferential power of private data stored on a large number of distributed clients, the local data at clients often contain sensitive or proprietary information without consent to share. It is thus desirable for federated optimization algorithms to preserve privacy in a guaranteed manner.
Our goal is to equip signal processing researchers with the core toolkits and recent advances of federated optimization, and to inspire the pursuit of further theory, algorithms, and applications from the signal processing community on this multi-disciplinary and fast-growing topic. Given the popularity of FL in various signal processing applications, we expect this tutorial to be timely and to attract a large audience. Last but not least, our theme on FL is closely related to building AI systems, and therefore fits very well with the conference's theme on AI this year.
Tutorial abstract and outline: The proposed tutorial will systematically cover recent advances in federated optimization, highlighting algorithmic ideas that enable resource efficiency, resiliency, and privacy in both server-client and network settings, and addressing multiple FL paradigms including but not limited to horizontal, vertical, and personalized FL. In particular, the primary focus will be on the nonconvex setting, which is more important for modern machine learning applications. The structure of the tutorial is tentatively outlined below. It is worth emphasizing that the algorithms mentioned below are examples of what we intend to cover and illustrate and should not be taken as an exhaustive list.
We will begin with an introduction to federated learning and its various popular variants, and discuss the unique challenges associated with federated optimization.
Efficient federated optimization: We will discuss resource-efficient, and in particular communication-efficient, federated optimization algorithms, including algorithms that perform multiple local updates (which aim to reduce the number of communication rounds) and communication compression (which aims to reduce the communication cost per round).
Resilient federated optimization: We will discuss the vulnerability of federated optimization in the presence of data heterogeneity, together with efficient algorithmic solutions that provably overcome these limitations.
Private federated optimization: We will highlight the necessity of privacy guarantees and notions of privacy measures, followed by algorithmic developments.
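Editorial note: to make the algorithmic ideas in the outline above concrete, the following is a minimal NumPy sketch (illustrative only, not taken from the tutorial materials) of a single server-client round that combines the three themes: multiple local updates per client, top-k communication compression, and clipped updates with Gaussian noise in the spirit of differentially private aggregation. All function names, the least-squares objective, and the hyperparameters are hypothetical choices made for illustration, not the specific algorithms the tutorial covers.

import numpy as np

def local_sgd(w_global, data, lr, local_steps):
    # Several local gradient steps on one client's data (hypothetical least-squares loss).
    X, y = data
    w = w_global.copy()
    for _ in range(local_steps):
        grad = X.T @ (X @ w - y) / len(y)   # gradient of (1/2n) * ||Xw - y||^2
        w -= lr * grad
    return w

def top_k(v, k):
    # Top-k sparsification: keep the k largest-magnitude entries, zero the rest.
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def fedavg_round(w_global, clients, lr=0.02, local_steps=5,
                 compress_k=None, clip=None, noise_std=None, rng=None):
    # One server-client round: local updates, optional clipping / compression / noise, then averaging.
    rng = np.random.default_rng(0) if rng is None else rng
    deltas = []
    for data in clients:
        delta = local_sgd(w_global, data, lr, local_steps) - w_global
        if clip is not None:        # bound the update norm (a prerequisite for DP accounting)
            delta = delta * min(1.0, clip / (np.linalg.norm(delta) + 1e-12))
        if compress_k is not None:  # communication compression before uplink transmission
            delta = top_k(delta, compress_k)
        if noise_std is not None:   # Gaussian noise added for illustrative privacy protection
            delta = delta + rng.normal(0.0, noise_std, size=delta.shape)
        deltas.append(delta)
    return w_global + np.mean(deltas, axis=0)

# Toy usage: three clients with deliberately heterogeneous synthetic regression data.
rng = np.random.default_rng(1)
d = 10
clients = []
for i in range(3):
    X = rng.normal(size=(50, d)) + 0.5 * i   # the shift makes client distributions differ
    w_true = rng.normal(size=d)
    clients.append((X, X @ w_true + 0.1 * rng.normal(size=50)))

w = np.zeros(d)
for _ in range(30):
    w = fedavg_round(w, clients, lr=0.02, local_steps=5, compress_k=5, clip=1.0, noise_std=0.01)

Increasing local_steps trades extra client computation for fewer communication rounds, while compress_k and noise_std trade accuracy for lower per-round communication cost and stronger privacy, respectively; with heterogeneous clients, plain averaging of local updates can drift, which is the vulnerability the resilient-algorithm portion of the tutorial addresses.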
format text
author CHI, Yuejie
LI, Zhize
title Tutorial: "Advances in federated optimization: Efficiency, resiliency, and privacy"
publisher Institutional Knowledge at Singapore Management University
publishDate 2023
url https://ink.library.smu.edu.sg/sis_research/8717
_version_ 1814047474568921088
spelling sg-smu-ink.sis_research-9720 2024-04-04T09:14:25Z 2023-06-01T07:00:00Z text https://ink.library.smu.edu.sg/sis_research/8717 Research Collection School Of Computing and Information Systems eng Institutional Knowledge at Singapore Management University Databases and Information Systems