Distribution-balanced federated learning for fault identification of power lines
The state-of-the-art centralized machine learning applied to fault identification trains the data collected from edge devices on the cloud server, owing to the limited computing resources at the edge. However, the possibility of data leakage increases considerably when data are shared with other devices on the cloud server, while training performance may degrade without data sharing. This study proposes a federated fault identification scheme, named DBFed-LSTM, which combines distribution-balanced federated learning with an attention-based bidirectional long short-term memory network and efficiently transfers the training process from the cloud server to edge devices. Under data privacy protection, local devices are specialized for storage and computation, while the cloud server updates the global model for learning vital time-frequency characteristics. Given that data from different devices monitoring a small-probability event are generally non-independent and identically distributed (non-IID), a global-model pre-training method and an improved focal loss are proposed accordingly. The case study verifies that DBFed-LSTM rivals centralized training with data sharing while preserving privacy and alleviating cloud-server computation pressure, even for non-IID data. Furthermore, it delivers considerably better performance and a more robust model than centralized training without data sharing.
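The two ingredients named in the abstract — federated aggregation of edge-trained models and a focal loss that counters imbalanced fault classes — can be sketched in a few lines. The sketch below is illustrative only: the function names (`focal_loss`, `fed_avg`) and the alpha/gamma parametrization are generic conventions, not taken from the paper, and the authors' improved focal loss, pre-training method, and BiLSTM model may differ in detail.

```python
import numpy as np

def focal_loss(probs, labels, alpha, gamma=2.0):
    """Class-balanced focal loss: down-weights easy examples (high p_t)
    via the (1 - p_t)^gamma focusing term and re-weights rare fault
    classes via a per-class alpha vector."""
    p_t = probs[np.arange(len(labels)), labels]   # probability of true class
    w = alpha[labels] * (1.0 - p_t) ** gamma      # per-sample weight
    return float(np.mean(-w * np.log(p_t + 1e-12)))

def fed_avg(local_weights, n_samples):
    """FedAvg-style aggregation: weighted mean of each edge device's
    parameter list, proportional to its local dataset size. Only the
    parameters travel to the server; the raw fault data stay local."""
    total = sum(n_samples)
    return [
        sum(n / total * w[i] for w, n in zip(local_weights, n_samples))
        for i in range(len(local_weights[0]))
    ]
```

With `gamma = 0` and unit `alpha`, `focal_loss` reduces to ordinary cross-entropy; raising `gamma` shrinks the contribution of well-classified samples, which is what makes the loss useful when fault events are rare relative to normal operation.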
Main Authors: Wang, Tianjing; Gooi, Hoay Beng
Other Authors: School of Electrical and Electronic Engineering
Format: Article
Language: English
Published: 2023
Subjects: Engineering::Electrical and electronic engineering; Fault Identification; Federated Learning
Online Access: https://hdl.handle.net/10356/172727
Institution: Nanyang Technological University
Citation: Wang, T. & Gooi, H. B. (2023). Distribution-balanced federated learning for fault identification of power lines. IEEE Transactions on Power Systems. https://dx.doi.org/10.1109/TPWRS.2023.3267463
ISSN: 0885-8950
DOI: 10.1109/TPWRS.2023.3267463
Funding: This research is supported by the Agency for Science, Technology and Research (A*STAR), Singapore, under its Singapore–Germany Academic Industry (2+2) International Collaboration, Grant A1990b0060.
Rights: © 2023 IEEE. All rights reserved.