Personalized federated learning with dynamic clustering and model distillation
Federated learning is a distributed machine learning technique that allows multiple data sources to collaboratively train models while keeping their raw data private. However, federated learning faces many challenges when dealing with non-independent and identically distributed (non-IID) data, espec...
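The collaborative training the abstract describes can be illustrated with a minimal federated-averaging sketch (not taken from the thesis): each client runs local training on its own private data and shares only model parameters, which a server averages. The toy linear model, learning rate, and helper names (local_step, fed_avg) are illustrative assumptions.

```python
import numpy as np

def local_step(w, X, y, lr=0.02, epochs=5):
    # One client's local training: plain gradient descent on its private data.
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)   # MSE gradient for a linear model
        w = w - lr * grad
    return w

def fed_avg(w, clients, sizes):
    # Server round: broadcast w, collect locally trained copies, and average
    # them weighted by each client's data size. Raw data never leaves a client.
    updates = [local_step(w.copy(), X, y) for X, y in clients]
    return np.average(updates, axis=0, weights=sizes)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Two clients whose inputs come from different distributions (a simple non-IID split).
clients = []
for shift in (0.0, 3.0):
    X = rng.normal(shift, 1.0, size=(50, 2))
    clients.append((X, X @ true_w + 0.1 * rng.normal(size=50)))

w = np.zeros(2)
sizes = [len(y) for _, y in clients]
for _ in range(20):           # communication rounds
    w = fed_avg(w, clients, sizes)
print("global model:", w)     # approaches true_w without pooling the raw data
```

Under the non-IID conditions mentioned in the abstract, this kind of naive averaging is exactly where a single global model can degrade, which is the setting the thesis addresses with dynamic clustering and model distillation.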
Main Author: Bao, Junyan
Other Authors: Tay Wee Peng
Format: Thesis-Master by Coursework
Language: English
Published: Nanyang Technological University, 2025
Online Access: https://hdl.handle.net/10356/181935
Institution: Nanyang Technological University
Similar Items
- An empirical study of the inherent resistance of knowledge distillation based federated learning to targeted poisoning attacks
  by: He, Weiyang, et al.
  Published: (2024)
- Peer-to-peer federated learning
  by: Sim, Nicholas Yong Yue
  Published: (2024)
- FedART: A neural model integrating federated learning and adaptive resonance theory
  by: PATERIA, Shubham, et al.
  Published: (2025)
- Effective intrusion detection in heterogeneous Internet-of-Things networks via ensemble knowledge distillation-based federated learning
  by: Shen, Jiyuan, et al.
  Published: (2024)
- Edge-computing-based knowledge distillation and multitask learning for partial discharge recognition
  by: Ji, Jinsheng, et al.
  Published: (2024)