Simultaneous-fault diagnosis considering time series with a deep learning transformer architecture for air handling units
An advanced deep learning-based method that employs a transformer architecture is proposed to diagnose simultaneous faults from time-series data. The method can be applied directly to transient data without a steady-state detector while maintaining accuracy, so faults can be diagnosed at an early stage. The transformer adopts a novel multi-head attention mechanism without any convolutional or recurrent layers, in contrast to conventional deep learning methods. The model has been verified on an on-site air handling unit with 6 single-fault cases, 7 simultaneous-fault cases, and normal operating conditions, achieving a test accuracy of 99.87%, a Jaccard score of 99.94%, and an F1 score of 99.95%. In addition, the attention distribution reveals the correlations between features and the corresponding fault. The length of the sliding window is found to be key to model performance, and the window length is chosen as a trade-off between model performance and diagnosis time. Based on a similar idea, a sequence-to-vector model built on the gated recurrent unit (GRU) is also proposed and benchmarked against the transformer model. The results show that the transformer model outperforms the GRU model, with better Jaccard and F1 scores and less training time.
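As a rough illustration of the approach the abstract describes (not the authors' code), the sketch below assumes PyTorch: a sliding window of multivariate AHU sensor readings is encoded with a transformer encoder using multi-head self-attention only, pooled into a single vector, and mapped to independent sigmoid outputs so that several faults can be flagged at once. The feature count, window length, layer sizes, and the 13 fault labels (6 single-fault cases plus 7 simultaneous-fault cases) are illustrative assumptions.

```python
# Minimal sketch, not the authors' implementation: transformer encoder over a
# sliding window of sensor readings, with multi-label (simultaneous-fault) outputs.
import torch
import torch.nn as nn

class FaultTransformer(nn.Module):
    def __init__(self, n_features=16, n_faults=13, d_model=64,
                 n_heads=4, n_layers=2, window_len=30):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)           # embed each time step
        self.pos_embed = nn.Parameter(torch.zeros(1, window_len, d_model))
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_faults)                   # one logit per fault label

    def forward(self, x):                                          # x: (batch, window_len, n_features)
        h = self.input_proj(x) + self.pos_embed                    # learned positional encoding
        h = self.encoder(h)                                        # multi-head self-attention over time
        return self.head(h.mean(dim=1))                            # sequence -> vector -> multi-label logits

model = FaultTransformer()
windows = torch.randn(8, 30, 16)                                   # batch of 30-step sensor windows (dummy data)
targets = torch.randint(0, 2, (8, 13)).float()                     # multi-hot fault labels (dummy data)
loss = nn.BCEWithLogitsLoss()(model(windows), targets)             # multi-label training objective
probs = torch.sigmoid(model(windows))                              # per-fault probabilities at inference
```

In such a multi-label setting, metrics like the reported Jaccard and F1 scores could be computed, for example, with scikit-learn's jaccard_score and f1_score (average='samples') on the binarised predictions; the window length (30 above) is the parameter the abstract flags as a trade-off between accuracy and diagnosis delay.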
Main Authors: Wu, Bingjie; Cai, Wenjian; Cheng, Fanyong; Chen, Haoran
Other Authors: School of Electrical and Electronic Engineering; SJ-NTU Corporate Lab
Format: Article
Language: English
Published: Energy and Buildings, 257, 111608 (2022). DOI: 10.1016/j.enbuild.2021.111608. ISSN: 0378-7788. © 2021 Elsevier B.V. All rights reserved.
Subjects: Engineering::Electrical and electronic engineering; Fault Diagnosis; Transformer Architecture
Online Access: https://hdl.handle.net/10356/161886
Institution: Nanyang Technological University