Lightweight privacy-preserving cross-cluster federated learning with heterogeneous data

Federated Learning (FL) eliminates data silos that hinder digital transformation while training a shared global model collaboratively. However, training a global model in the context of FL has been highly susceptible to heterogeneity and privacy concerns due to discrepancies in data distribution, which may lead to potential data leakage from uploading model updates. Despite intensive research on the aforementioned issues, existing approaches fail to balance robustness and privacy in FL. Furthermore, limiting model updates or iterative clustering tends to fall into local optimum problems in heterogeneous (Non-IID) scenarios. In this work, to address these deficiencies, we provide lightweight privacy-preserving cross-cluster federated learning (PrivCrFL) on Non-IID data, to trade off robustness and privacy in Non-IID settings. Our PrivCrFL exploits secure one-shot hierarchical clustering with cross-cluster shifting for optimizing sub-group convergence. Furthermore, we introduce intra-cluster learning and inter-cluster learning with separate aggregation for mutual learning between groups. We perform extensive experimental evaluations on three benchmark datasets and compare our results with state-of-the-art studies. The findings indicate that PrivCrFL offers a notable performance enhancement, with improvements ranging from 0.26%↑ to 1.35%↑ across different Non-IID settings. PrivCrFL also demonstrates a superior communication compression ratio in secure aggregation, outperforming current state-of-the-art works by 10.59%.
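The cluster-then-aggregate idea described in the abstract (grouping clients once by the similarity of their model updates, averaging within each cluster, then mixing across clusters) can be sketched as follows. This is a minimal illustrative sketch, not the authors' PrivCrFL protocol: the greedy clustering rule, the `threshold` and `beta` parameters, and all function names are invented for illustration, and the secure-aggregation and privacy machinery of the paper is omitted entirely.

```python
# Illustrative sketch of cluster-then-aggregate federated learning
# (hypothetical; not the PrivCrFL protocol from the paper).
import numpy as np

def cosine_dist(a, b):
    """Cosine distance between two flattened model-update vectors."""
    return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def one_shot_cluster(updates, threshold=0.5):
    """Greedy one-shot clustering: assign each client to the first
    cluster whose centroid is within `threshold`, else open a new one."""
    clusters = []  # each cluster is a list of client indices
    for i, u in enumerate(updates):
        for c in clusters:
            centroid = np.mean([updates[j] for j in c], axis=0)
            if cosine_dist(u, centroid) < threshold:
                c.append(i)
                break
        else:
            clusters.append([i])
    return clusters

def aggregate(updates, clusters, beta=0.3):
    """Intra-cluster averaging, then inter-cluster mixing: each cluster
    keeps (1 - beta) of its own model and takes beta of the global mean."""
    cluster_models = [np.mean([updates[j] for j in c], axis=0) for c in clusters]
    global_model = np.mean(cluster_models, axis=0)
    return [(1 - beta) * m + beta * global_model for m in cluster_models]
```

With two groups of clients whose updates point in different directions (e.g. Non-IID label skew), `one_shot_cluster` separates them in a single pass, and `aggregate` lets each sub-group converge on its own model while still sharing information globally.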


Bibliographic Details
Main Authors: CHEN, Zekai, YU, Shengxing, CHEN, Farong, WANG, Fuyi, LIU, Ximeng, DENG, Robert H.
Format: text
Language:English
Published: Institutional Knowledge at Singapore Management University 2024
Subjects:
Online Access:https://ink.library.smu.edu.sg/sis_research/9637
Institution: Singapore Management University
Language: English
id sg-smu-ink.sis_research-10637
record_format dspace
spelling sg-smu-ink.sis_research-106372024-11-23T15:18:03Z Lightweight privacy-preserving cross-cluster federated learning with heterogeneous data CHEN, Zekai YU, Shengxing CHEN, Farong WANG, Fuyi LIU, Ximeng DENG, Robert H. Federated Learning (FL) eliminates data silos that hinder digital transformation while training a shared global model collaboratively. However, training a global model in the context of FL has been highly susceptible to heterogeneity and privacy concerns due to discrepancies in data distribution, which may lead to potential data leakage from uploading model updates. Despite intensive research on the aforementioned issues, existing approaches fail to balance robustness and privacy in FL. Furthermore, limiting model updates or iterative clustering tends to fall into local optimum problems in heterogeneous (Non-IID) scenarios. In this work, to address these deficiencies, we provide lightweight privacy-preserving cross-cluster federated learning (PrivCrFL) on Non-IID data, to trade off robustness and privacy in Non-IID settings. Our PrivCrFL exploits secure one-shot hierarchical clustering with cross-cluster shifting for optimizing sub-group convergence. Furthermore, we introduce intra-cluster learning and inter-cluster learning with separate aggregation for mutual learning between groups. We perform extensive experimental evaluations on three benchmark datasets and compare our results with state-of-the-art studies. The findings indicate that PrivCrFL offers a notable performance enhancement, with improvements ranging from 0.26%↑ to 1.35%↑ across different Non-IID settings. PrivCrFL also demonstrates a superior communication compression ratio in secure aggregation, outperforming current state-of-the-art works by 10.59%.
2024-11-01T07:00:00Z text https://ink.library.smu.edu.sg/sis_research/9637 info:doi/10.1109/TIFS.2024.3435476 Research Collection School Of Computing and Information Systems eng Institutional Knowledge at Singapore Management University Distributed machine learning federated learning heterogeneity data privacy preserving Artificial Intelligence and Robotics Information Security
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic Distributed machine learning
federated learning
heterogeneity data
privacy preserving
Artificial Intelligence and Robotics
Information Security
spellingShingle Distributed machine learning
federated learning
heterogeneity data
privacy preserving
Artificial Intelligence and Robotics
Information Security
CHEN, Zekai
YU, Shengxing
CHEN, Farong
WANG, Fuyi
LIU, Ximeng
DENG, Robert H.
Lightweight privacy-preserving cross-cluster federated learning with heterogeneous data
description Federated Learning (FL) eliminates data silos that hinder digital transformation while training a shared global model collaboratively. However, training a global model in the context of FL has been highly susceptible to heterogeneity and privacy concerns due to discrepancies in data distribution, which may lead to potential data leakage from uploading model updates. Despite intensive research on the aforementioned issues, existing approaches fail to balance robustness and privacy in FL. Furthermore, limiting model updates or iterative clustering tends to fall into local optimum problems in heterogeneous (Non-IID) scenarios. In this work, to address these deficiencies, we provide lightweight privacy-preserving cross-cluster federated learning (PrivCrFL) on Non-IID data, to trade off robustness and privacy in Non-IID settings. Our PrivCrFL exploits secure one-shot hierarchical clustering with cross-cluster shifting for optimizing sub-group convergence. Furthermore, we introduce intra-cluster learning and inter-cluster learning with separate aggregation for mutual learning between groups. We perform extensive experimental evaluations on three benchmark datasets and compare our results with state-of-the-art studies. The findings indicate that PrivCrFL offers a notable performance enhancement, with improvements ranging from 0.26%↑ to 1.35%↑ across different Non-IID settings. PrivCrFL also demonstrates a superior communication compression ratio in secure aggregation, outperforming current state-of-the-art works by 10.59%.
format text
author CHEN, Zekai
YU, Shengxing
CHEN, Farong
WANG, Fuyi
LIU, Ximeng
DENG, Robert H.
author_facet CHEN, Zekai
YU, Shengxing
CHEN, Farong
WANG, Fuyi
LIU, Ximeng
DENG, Robert H.
author_sort CHEN, Zekai
title Lightweight privacy-preserving cross-cluster federated learning with heterogeneous data
title_short Lightweight privacy-preserving cross-cluster federated learning with heterogeneous data
title_full Lightweight privacy-preserving cross-cluster federated learning with heterogeneous data
title_fullStr Lightweight privacy-preserving cross-cluster federated learning with heterogeneous data
title_full_unstemmed Lightweight privacy-preserving cross-cluster federated learning with heterogeneous data
title_sort lightweight privacy-preserving cross-cluster federated learning with heterogeneous data
publisher Institutional Knowledge at Singapore Management University
publishDate 2024
url https://ink.library.smu.edu.sg/sis_research/9637
_version_ 1816859174361890816