Lightweight privacy-preserving cross-cluster federated learning with heterogeneous data
Main Authors:
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2024
Online Access: https://ink.library.smu.edu.sg/sis_research/9637
Institution: Singapore Management University
Summary: Federated Learning (FL) eliminates the data silos that hinder digital transformation by training a shared global model collaboratively. However, training a global model in FL is highly susceptible to heterogeneity and privacy concerns: discrepancies in data distribution degrade robustness, and uploading model updates may leak private data. Despite intensive research on these issues, existing approaches fail to balance robustness and privacy in FL. Furthermore, limiting model updates or iterative clustering tends to fall into local optima in heterogeneous (Non-IID) scenarios. In this work, to address these deficiencies, we provide lightweight privacy-preserving cross-cluster federated learning (PrivCrFL) on Non-IID data, to trade off robustness and privacy in Non-IID settings. PrivCrFL exploits secure one-shot hierarchical clustering with cross-cluster shifting to optimize sub-group convergence. Furthermore, we introduce intra-cluster learning and inter-cluster learning with separate aggregation for mutual learning between groups. We perform extensive experimental evaluations on three benchmark datasets and compare our results with state-of-the-art studies. The findings indicate that PrivCrFL offers a notable performance improvement, ranging from 0.26%↑ to 1.35%↑ across different Non-IID settings. PrivCrFL also demonstrates a superior communication compression ratio in secure aggregation, outperforming current state-of-the-art work by 10.59%.
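To make the abstract's pipeline concrete, the sketch below illustrates the general idea of one-shot hierarchical clustering of client updates followed by separate intra-cluster and inter-cluster aggregation. This is a toy illustration only, not the paper's method: the function names, the choice of single-linkage clustering on cosine distance, and plain averaging are all assumptions, and the secure (privacy-preserving) aggregation and cross-cluster shifting described in the abstract are omitted.

```python
import math

def cosine_distance(u, v):
    """Cosine distance between two flattened client updates."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u)) or 1e-12
    nv = math.sqrt(sum(b * b for b in v)) or 1e-12
    return 1.0 - dot / (nu * nv)

def one_shot_hierarchical_clustering(updates, n_clusters):
    """Group client updates once (no iterative re-clustering) using
    single-linkage agglomerative clustering on cosine distance."""
    dist = [[cosine_distance(u, v) for v in updates] for u in updates]
    clusters = [{i} for i in range(len(updates))]
    while len(clusters) > n_clusters:
        # merge the two closest clusters (single linkage)
        a, b = min(
            ((x, y) for x in range(len(clusters)) for y in range(x + 1, len(clusters))),
            key=lambda p: min(dist[i][j] for i in clusters[p[0]] for j in clusters[p[1]]),
        )
        clusters[a] |= clusters[b]
        del clusters[b]
    return clusters

def mean(vectors):
    """Coordinate-wise average of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[k] for v in vectors) / n for k in range(len(vectors[0]))]

def federated_round(updates, clusters):
    """Intra-cluster aggregation first, then inter-cluster aggregation
    of the per-cluster models (separate aggregation, unsecured toy version)."""
    cluster_models = [mean([updates[i] for i in sorted(c)]) for c in clusters]
    global_model = mean(cluster_models)
    return cluster_models, global_model

# Usage: six clients whose updates fall into two Non-IID groups.
updates = [[1.0, 0.0], [0.9, 0.1], [1.1, -0.05],
           [0.0, 1.0], [0.1, 0.9], [-0.05, 1.1]]
clusters = one_shot_hierarchical_clustering(updates, n_clusters=2)
cluster_models, global_model = federated_round(updates, clusters)
```

Clustering once up front, rather than re-clustering every round, is what avoids the local-optimum behavior the abstract attributes to iterative clustering; the per-cluster models then exchange information only through the inter-cluster aggregation step.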