Tutorial: "Advances in federated optimization: Efficiency, resiliency, and privacy"
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2023
Online Access: https://ink.library.smu.edu.sg/sis_research/8717
Institution: Singapore Management University
Summary:

Rationale for the tutorial: The proliferation of multi-agent environments in emerging applications such as the internet-of-things (IoT), networked sensing, and autonomous systems has led to a flurry of activity on federated and decentralized optimization algorithms for training predictive models, particularly under the umbrella of federated learning (FL). The distinctive features of these large-scale distributed systems pose unique challenges that are not well captured by the classical distributed optimization framework, and have therefore spurred a significant amount of recent algorithmic development focusing on the following aspects:

- Resource efficiency: The ever-growing scale and high dimensionality of datasets call for algorithms that perform well with limited communication and computation resources.
- Resiliency to heterogeneity: Data samples collected from different agents can be highly unbalanced and heterogeneous, in which case vanilla federated optimization algorithms (e.g., FedAvg) can converge very slowly or even diverge; better algorithm designs are needed to handle this heterogeneity.
- Privacy preservation: While FL holds great promise for harnessing the inferential power of private data stored on a large number of distributed clients, these local data often contain sensitive or proprietary information that cannot be shared without consent. It is thus desirable for federated optimization algorithms to preserve privacy in a guaranteed manner.

Our goal is to equip signal processing researchers with the core toolkits and recent advances of federated optimization, and to inspire the pursuit of further theory, algorithms, and applications from the signal processing community on this multi-disciplinary and fast-growing topic. Given the popularity of FL in various signal processing applications, we expect this tutorial to be timely and to attract a large audience. Last but not least, our theme on FL is closely related to building AI systems, and therefore fits very well with the conference's theme on AI this year.

Tutorial abstract and outline: The proposed tutorial will systematically cover recent advances in federated optimization, highlighting algorithmic ideas that enable resource efficiency, resiliency, and privacy in both server-client and network settings, and addressing multiple FL paradigms including, but not limited to, horizontal, vertical, and personalized FL. The primary focus will be on the nonconvex setting, which is most relevant to modern machine learning applications. The structure of the tutorial is tentatively outlined below.
It is worth emphasizing that the algorithms mentioned below are examples of what we intend to cover and should not be taken as an exhaustive list.

- Introduction: We will begin with an introduction to federated learning and its popular variants, and discuss the unique challenges associated with federated optimization.
- Efficient federated optimization: We will discuss resource-efficient, and in particular communication-efficient, federated optimization algorithms, including algorithms that perform multiple local updates (to reduce the number of communication rounds) and communication compression (to reduce the communication cost per round); a minimal sketch of these ideas follows the outline.
- Resilient federated optimization: We will discuss the vulnerability of federated optimization in the presence of data heterogeneity, together with efficient algorithmic solutions that provably overcome these limitations.
- Private federated optimization: We will highlight the necessity of privacy guarantees and the notions used to measure privacy, followed by algorithmic developments.
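To make the first algorithmic items above concrete, the following is a minimal, self-contained sketch (not taken from the tutorial material itself) of a FedAvg-style loop on a synthetic heterogeneous least-squares problem. The number of local steps, the top-k compressor, and the clip-and-add-Gaussian-noise step on each client update are all illustrative assumptions meant to show where communication savings and privacy noise enter the algorithm; in particular, the noise step is not a calibrated differentially private mechanism.

```python
# Illustrative sketch of a FedAvg-style federated optimization loop with
# multiple local updates, optional top-k compression of the uploaded delta,
# and an optional clip-and-noise step. All hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
d, n_clients, n_samples = 20, 10, 50

# Heterogeneous clients: each client's data is generated around a different
# local solution, the kind of non-i.i.d. setting where vanilla FedAvg drifts.
data = []
for _ in range(n_clients):
    w_star = rng.normal(scale=2.0, size=d)
    A = rng.normal(size=(n_samples, d))
    b = A @ w_star + 0.1 * rng.normal(size=n_samples)
    data.append((A, b))

def local_grad(w, A, b):
    """Gradient of the local least-squares loss 0.5/n * ||A w - b||^2."""
    return A.T @ (A @ w - b) / len(b)

def top_k(vec, k):
    """Keep only the k largest-magnitude entries (a simple biased compressor)."""
    out = np.zeros_like(vec)
    idx = np.argsort(np.abs(vec))[-k:]
    out[idx] = vec[idx]
    return out

def fedavg(rounds=100, local_steps=5, lr=0.02, k=None, clip=None, noise_mult=0.0):
    """FedAvg-style loop: local updates, optional compression, optional noising."""
    w_global = np.zeros(d)
    for _ in range(rounds):
        deltas = []
        for A, b in data:                    # each client, conceptually in parallel
            w = w_global.copy()
            for _ in range(local_steps):     # multiple local updates per round
                w -= lr * local_grad(w, A, b)
            delta = w - w_global             # only the model delta is uploaded
            if k is not None:
                delta = top_k(delta, k)      # compress to cut per-round cost
            if clip is not None:             # clip, then add Gaussian noise
                delta *= min(1.0, clip / (np.linalg.norm(delta) + 1e-12))
                delta += noise_mult * clip * rng.normal(size=d)
            deltas.append(delta)
        w_global += np.mean(deltas, axis=0)  # server averages the client deltas
    return w_global

w = fedavg(k=5, clip=1.0, noise_mult=0.05)
avg_loss = np.mean([0.5 * np.mean((A @ w - b) ** 2) for A, b in data])
print(f"average client loss after training: {avg_loss:.4f}")
```

Raising `local_steps` or lowering `k` trades accuracy against communication, which is the kind of trade-off the efficiency and resiliency portions of the tutorial are concerned with.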