Personalized federated learning with dynamic clustering and model distillation
Federated learning is a distributed machine learning technique that allows various data sources to work together to train models while keeping their raw data private. However, federated learning faces many challenges when dealing with non-independent and identically distributed (non-IID) data, especially the problem of data heterogeneity, which can significantly degrade model performance. To address this challenge, we propose a new algorithm for personalized federated learning, known as pFedCluster. The core of the pFedCluster algorithm is to dynamically cluster clients using hierarchical tree clustering, which ensures minimal intra-cluster distance and maximal inter-cluster distance, thus optimizing the clustering effect. Additionally, the algorithm facilitates knowledge transfer between clusters through knowledge distillation, further enhancing model performance. This method improves model personalization by dynamically adjusting the clustering structure to suit varying data distributions. Experimental results show that pFedCluster effectively improves model performance on the MNIST and CIFAR-10 datasets, demonstrating significant advantages in dealing with data heterogeneity compared with traditional federated learning algorithms. Our code is at https://github.com/NtuEEEJackie/pFedCluster.
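The clustering step described in the abstract — grouping clients so that intra-cluster distance stays small and inter-cluster distance stays large — can be sketched as a simple agglomerative procedure over the clients' flattened model-update vectors. This is a minimal illustration, not the thesis's actual implementation: the single-linkage rule, Euclidean distance, and the `threshold` stopping criterion are all assumptions.

```python
import numpy as np

def cluster_clients(updates, threshold=1.0):
    """Greedy agglomerative clustering of client update vectors.

    Each client is represented by its flattened model-update vector.
    The two closest clusters (single linkage, Euclidean distance) are
    merged repeatedly until every remaining inter-cluster distance
    exceeds `threshold`, so clusters stay tight internally while
    staying well separated from each other.
    """
    clusters = [[i] for i in range(len(updates))]
    while len(clusters) > 1:
        best = None
        # Find the closest pair of clusters (single linkage).
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(np.linalg.norm(updates[i] - updates[j])
                        for i in clusters[a] for j in clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        if best[0] > threshold:  # all clusters are well separated: stop
            break
        _, a, b = best
        clusters[a] += clusters.pop(b)  # merge the closest pair
    return clusters

# Two clients near the origin and two near (5, 5) fall into two clusters.
ups = [np.array([0.0, 0.0]), np.array([0.1, 0.0]),
       np.array([5.0, 5.0]), np.array([5.1, 5.0])]
print(cluster_clients(ups, threshold=1.0))
```

In practice a real implementation would recluster periodically as client updates drift, which is what makes the scheme "dynamic".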
Saved in:
Main Author: | Bao, Junyan |
---|---|
Other Authors: | Tay Wee Peng |
Format: | Thesis-Master by Coursework |
Language: | English |
Published: | Nanyang Technological University, 2025 |
Subjects: | Computer and Information Science; Federated learning; Hierarchical clustering; Knowledge distillation |
Online Access: | https://hdl.handle.net/10356/181935 |
Institution: | Nanyang Technological University |
id |
sg-ntu-dr.10356-181935 |
---|---|
record_format |
dspace |
spelling |
sg-ntu-dr.10356-181935 2025-01-03T15:46:11Z Personalized federated learning with dynamic clustering and model distillation Bao, Junyan Tay Wee Peng School of Electrical and Electronic Engineering wptay@ntu.edu.sg Computer and Information Science; Federated learning; Hierarchical clustering; Knowledge distillation Master's degree 2025-01-03T00:45:21Z 2025-01-03T00:45:21Z 2024 Thesis-Master by Coursework Bao, J. (2024). Personalized federated learning with dynamic clustering and model distillation. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/181935 en application/pdf Nanyang Technological University |
institution |
Nanyang Technological University |
building |
NTU Library |
continent |
Asia |
country |
Singapore |
content_provider |
NTU Library |
collection |
DR-NTU |
language |
English |
topic |
Computer and Information Science; Federated learning; Hierarchical clustering; Knowledge distillation |
description |
Federated learning is a distributed machine learning technique that allows various data sources to work together to train models while keeping their raw data private. However, federated learning faces many challenges when dealing with non-independent and identically distributed (non-IID) data, especially the problem of data heterogeneity, which can significantly degrade model performance. To address this challenge, we propose a new algorithm for personalized federated learning, known as pFedCluster. The core of the pFedCluster algorithm is to dynamically cluster clients using hierarchical tree clustering, which ensures minimal intra-cluster distance and maximal inter-cluster distance, thus optimizing the clustering effect. Additionally, the algorithm facilitates knowledge transfer between clusters through knowledge distillation, further enhancing model performance. This method improves model personalization by dynamically adjusting the clustering structure to suit varying data distributions. Experimental results show that pFedCluster effectively improves model performance on the MNIST and CIFAR-10 datasets, demonstrating significant advantages in dealing with data heterogeneity compared with traditional federated learning algorithms. Our code is at https://github.com/NtuEEEJackie/pFedCluster. |
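The inter-cluster knowledge transfer is described above only at a high level; one standard way to realize knowledge distillation is the temperature-softened KL objective of Hinton et al., sketched here under the assumption that each cluster model exposes logits. The function names and the temperature value are illustrative, not taken from the thesis.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T gives softer targets."""
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened outputs.

    The T*T factor keeps gradient magnitudes comparable across
    temperatures, following the usual distillation convention.
    Here the "teacher" would be another cluster's model and the
    "student" the local cluster model being trained.
    """
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))

# Identical logits incur zero loss; disagreement incurs a positive loss.
print(distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))
print(distillation_loss([1.0, 2.0, 3.0], [3.0, 2.0, 1.0]))
```

In a full training loop this term would be added, with a weighting coefficient, to each cluster's ordinary supervised loss.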
author2 |
Tay Wee Peng |
format |
Thesis-Master by Coursework |
author |
Bao, Junyan |
title |
Personalized federated learning with dynamic clustering and model distillation |
publisher |
Nanyang Technological University |
publishDate |
2025 |
url |
https://hdl.handle.net/10356/181935 |
_version_ |
1821237106152833024 |