A novel Transformer-based model with large kernel temporal convolution for chemical process fault detection

Zhichao Zhu, Feiyang Chen, Lei Ni, Haitao Bian, Juncheng Jiang, Zhiquan Chen

Research output: Contribution to journal › Article › peer-review

Abstract

Fault detection and diagnosis (FDD) is an essential tool for ensuring safety in the chemical industry, and many reconstruction-based deep learning methods are now active in fault detection. However, many algorithms still fall short of ideal performance in practice. Inspired by the core mechanism of the Transformer and by large kernel convolution, this paper proposes a novel model combining a variate-centric Transformer with large kernel temporal convolution. The variate-centric Transformer relies on self-attention to capture the multivariate correlations of the input data, while the large kernel temporal convolution collects period information to summarize temporal features. The benchmark Tennessee Eastman process (TEP) dataset and experimental data from a microreactor process are used to evaluate fault detection performance. Compared with other reconstruction-based methods, the results demonstrate that our model achieves a higher fault detection rate and lower detection latency, showing significant potential for process safety.
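The two components described above can be illustrated with a minimal numpy sketch. This is not the authors' implementation: the identity projections, the averaging kernel, the kernel size, and the reconstruction-error statistic are all placeholder assumptions standing in for learned parameters; the sketch only shows the data flow of variate-centric attention (tokens are process variables, not time steps) followed by a large-kernel depthwise temporal convolution, with the reconstruction error used as the fault indicator.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def variate_attention(X):
    # X: (num_variables, time_steps) -- each row is one process variable's series.
    # Variate-centric self-attention: attention is computed BETWEEN variables,
    # so the (vars x vars) attention map captures multivariate correlations.
    Q = K = V = X  # identity projections stand in for learned weight matrices
    scores = Q @ K.T / np.sqrt(X.shape[1])
    return softmax(scores, axis=-1) @ V  # (num_variables, time_steps)

def large_kernel_temporal_conv(X, kernel_size=25):
    # Depthwise convolution along the time axis with a large kernel
    # ("same" padding), summarizing period information per variable.
    kernel = np.ones(kernel_size) / kernel_size  # placeholder learned kernel
    return np.stack([np.convolve(row, kernel, mode="same") for row in X])

def reconstruction_error(X):
    # Reconstruction-based detection: reconstruct the input through both
    # modules and flag a fault when the error exceeds a control limit.
    X_hat = large_kernel_temporal_conv(variate_attention(X))
    return np.mean((X - X_hat) ** 2)
```

In a trained model the error statistic would be compared against a threshold estimated from fault-free data; samples exceeding it are flagged as faulty.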

Original language: English
Article number: 108762
Journal: Computers and Chemical Engineering
Volume: 188
DOIs
State: Published - Sep 2024

Keywords

  • Chemical process
  • Fault detection
  • Large kernel temporal convolution
  • Reconstruction method
  • Transformer
