Simultaneous-fault diagnosis considering time series with a deep learning transformer architecture for air handling units

Bibliographic Details
Main Authors: Wu, Bingjie; Cai, Wenjian; Cheng, Fanyong; Chen, Haoran
Other Authors: School of Electrical and Electronic Engineering
Format: Article
Language: English
Published: 2022
Subjects:
Online Access: https://hdl.handle.net/10356/161886
Institution: Nanyang Technological University
Description
Summary: An advanced deep learning method that employs a transformer architecture is proposed to diagnose simultaneous faults from time-series data. The method can be applied directly to transient data without a steady-state detector while maintaining accuracy, so faults can be diagnosed at an early stage. The transformer adopts a novel multi-head attention mechanism and involves none of the convolutional or recurrent layers used in conventional deep learning methods. The model was verified on an on-site air handling unit with 6 single-fault cases, 7 simultaneous-fault cases, and normal operating conditions, achieving a test accuracy of 99.87%, a Jaccard score of 99.94%, and an F1 score of 99.95%. In addition, the attention distribution reveals the correlations between features and the corresponding faults. The length of the sliding window is found to be key to model performance, and the window length is chosen as a trade-off between model performance and diagnosis time. Based on a similar idea, a sequence-to-vector model built on the gated recurrent unit (GRU) is proposed and benchmarked against the transformer model. The results show that the transformer outperforms the GRU model, with better Jaccard and F1 scores in less training time.
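
As context for the summary, the sketch below shows one plausible shape for such a model: a transformer encoder that maps a sliding window of sensor readings to independent per-fault probabilities, so several faults can be flagged at once. This is not the authors' implementation; the class name, window length, feature count, and all hyperparameters are illustrative assumptions.

    # A minimal sketch, assuming a multi-label framing of simultaneous faults.
    # Window length (60), feature count (16), and hyperparameters are assumptions.
    import torch
    import torch.nn as nn

    class FaultTransformer(nn.Module):  # hypothetical name, not from the paper
        def __init__(self, n_features=16, n_faults=6, window=60,
                     d_model=64, n_heads=4, n_layers=2):
            super().__init__()
            self.embed = nn.Linear(n_features, d_model)   # per-timestep projection
            self.pos = nn.Parameter(torch.zeros(1, window, d_model))  # learned positions
            layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                               dim_feedforward=128, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, n_layers)    # attention only,
            self.head = nn.Linear(d_model, n_faults)      # no conv/recurrent layers

        def forward(self, x):                             # x: (batch, window, n_features)
            h = self.encoder(self.embed(x) + self.pos)    # multi-head self-attention
            return self.head(h.mean(dim=1))               # pool over time -> fault logits

    model = FaultTransformer()
    logits = model(torch.randn(8, 60, 16))                # 8 windows of 60 timesteps
    preds = (torch.sigmoid(logits) > 0.5).int()           # several labels may fire at once

Framed this way, a simultaneous fault is simply a window with more than one active label, and training would use a per-label binary loss such as torch.nn.BCEWithLogitsLoss; the summary's trade-off between window length and diagnosis time corresponds to the window parameter here.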
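
The Jaccard and F1 scores quoted in the summary are standard multi-label metrics. The sketch below shows one way to compute them with scikit-learn; the label layout and the sample-wise averaging mode are assumptions, since this record does not state the paper's exact choices.

    # Hedged sketch of multi-label scoring. Rows are windows; columns are base
    # fault labels. A simultaneous fault is a row with several ones; normal
    # operation is the all-zero row.
    import numpy as np
    from sklearn.metrics import accuracy_score, f1_score, jaccard_score

    y_true = np.array([[1, 0, 0, 0, 0, 0],    # single fault
                       [1, 0, 1, 0, 0, 0],    # simultaneous fault (labels 1 and 3)
                       [0, 0, 0, 0, 0, 0]])   # normal operation
    y_pred = np.array([[1, 0, 0, 0, 0, 0],
                       [1, 0, 1, 0, 0, 0],
                       [0, 0, 0, 0, 0, 0]])

    # zero_division=1 scores the all-zero normal row as a perfect match.
    print(accuracy_score(y_true, y_pred))     # exact-match (subset) accuracy
    print(jaccard_score(y_true, y_pred, average="samples", zero_division=1))
    print(f1_score(y_true, y_pred, average="samples", zero_division=1))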