A novel Transformer-based model with large kernel temporal convolution for chemical process fault detection

Zhichao Zhu, Feiyang Chen, Lei Ni, Haitao Bian, Juncheng Jiang, Zhiquan Chen

Research output: Journal article › peer-reviewed

Abstract

Fault detection and diagnosis (FDD) is an essential tool for ensuring safety in the chemical industry, and many reconstruction-based deep learning methods are now active in fault detection. However, many algorithms still fall short of ideal performance in practice. Inspired by the core mechanism of the Transformer and by large kernel convolution, this paper proposes a novel model that combines a variate-centric Transformer with large kernel temporal convolution. The variate-centric Transformer relies on self-attention to capture the multivariate correlations of the input data, while the large kernel temporal convolution collects periodic information to summarize temporal features. The benchmark Tennessee Eastman process (TEP) dataset and experimental data from a microreactor process are used to evaluate fault detection performance. Compared with other reconstruction-based methods, the results demonstrate that our model achieves a higher fault detection rate and lower detection latency, and it shows significant potential for process safety.
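The paper's implementation is not reproduced here; purely as an illustration of the two ideas the abstract names — variable-wise self-attention (tokens are whole process variables, so attention scores express variable-to-variable correlation) and a long temporal kernel (a wide receptive field over slow process dynamics) — the following is a minimal NumPy sketch. All function names, shapes, and weight choices are assumptions for illustration, not the authors' architecture:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def variate_attention(X, Wq, Wk, Wv):
    # X: (n_vars, seq_len) -- each row is one process variable's series.
    # Treating each variable as a token, the (n_vars x n_vars) score
    # matrix captures multivariate correlations, not time-step mixing.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = softmax(Q @ K.T / np.sqrt(K.shape[1]))
    return scores @ V                      # (n_vars, d_v)

def large_kernel_conv1d(x, kernel):
    # 'same'-padded 1-D convolution over one variable's series; a long
    # kernel widens the temporal receptive field to cover period-scale
    # behavior. Odd kernel length assumed.
    pad = len(kernel) // 2
    xp = np.pad(x, pad, mode="edge")
    return np.array([xp[i:i + len(kernel)] @ kernel
                     for i in range(len(x))])
```

In a reconstruction-based detector of this kind, features from both branches would feed a decoder, and a fault is flagged when the reconstruction error of a new sample exceeds a threshold fitted on normal-operation data.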

Original language: English
Article number: 108762
Journal: Computers and Chemical Engineering
Volume: 188
DOI
Publication status: Published - Sep 2024
