A Mask Self-Supervised Learning-Based Transformer for Bearing Fault Diagnosis With Limited Labeled Samples

Abstract:

In recent years, the transformer has become an effective tool for fault diagnosis, but training a transformer model typically requires a large amount of labeled data. In practice, however, only a small number of labeled samples can be obtained from industrial processes, and labeling a large quantity of training samples is costly. To reduce the demand for labeled training samples, this article proposes a mask self-supervised learning-based transformer (MSFormer) for bearing fault diagnosis of multistage centrifugal fans in petrochemical units under the condition of limited labeled samples. In mask self-supervised learning (SSL), unlabeled samples are used to mine robust representations of fault signals and the potential relationships between subsequences, yielding a pretrained model with well-generalized parameters. A few labeled samples are then used to fine-tune the model by supervised learning, giving MSFormer the ability to discriminate among different bearing fault types. The effectiveness of the proposed method is validated on a multistage centrifugal fan dataset and the Case Western Reserve University (CWRU) motor bearing dataset. The experimental results demonstrate that MSFormer effectively reduces the number of labeled training samples required and, compared with state-of-the-art methods, achieves superior diagnostic performance under the condition of limited labeled samples.
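To make the mask-SSL pretraining idea concrete, the following is a minimal sketch of the subsequence-masking step described in the abstract: a 1-D vibration signal is split into fixed-length subsequences (patches), a random subset of which is masked so that a model can be pretrained to reconstruct them from the unmasked context. The patch length, mask ratio, and zero-filling strategy are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def mask_subsequences(signal, patch_len=32, mask_ratio=0.5, rng=None):
    """Split a 1-D signal into patches and zero-mask a random subset.

    Returns the masked signal, the boolean patch mask (True = masked),
    and the original patches (the reconstruction targets for SSL).
    Patch length and mask ratio are hypothetical defaults.
    """
    rng = rng or np.random.default_rng(0)
    n_patches = len(signal) // patch_len
    # Drop any trailing samples that do not fill a whole patch.
    patches = signal[: n_patches * patch_len].reshape(n_patches, patch_len)
    n_masked = int(round(n_patches * mask_ratio))
    mask = np.zeros(n_patches, dtype=bool)
    mask[rng.choice(n_patches, size=n_masked, replace=False)] = True
    masked = patches.copy()
    masked[mask] = 0.0  # masked patches are replaced by zeros
    return masked.reshape(-1), mask, patches

# During pretraining, a transformer encoder would take the masked signal
# and be trained to reconstruct `patches[mask]`; the pretrained weights
# are then fine-tuned on the few labeled fault samples.
```

In this sketch the self-supervised objective would be a reconstruction loss (e.g., mean squared error) between the model's output at the masked positions and the original patches, which requires no fault labels at all.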