Open Set Fault Diagnosis via Supervised Contrastive Learning With Negative Out of Distribution Data

Abstract:

Fault diagnosis in an open world refers to diagnosis tasks that must cope with previously unknown faults in the online stage. It faces a major unresolved challenge: online data from unknown faults may be classified as normal samples with high probability. In this article, we develop an effective solution to this challenge by using supervised contrastive learning to learn a discriminative and compact embedding for the known normal and fault situations. Specifically, in addition to contrasting a given sample with other instances, as in conventional contrastive learning methods, our training scheme contrasts the normal samples with negative augmentations of themselves. The negative out-of-distribution data are generated by the Soft Brownian Offset sampling method to simulate previously unknown faults. Computational experiments are conducted on the Tennessee Eastman Process benchmark dataset and a practical plasma etching process dataset. The proposed method achieves significant improvement over four existing methods under three open-set fault diagnosis settings, i.e., balanced open-set fault diagnosis, imbalanced fault diagnosis, and few-shot fault diagnosis, demonstrating its great potential in real-world fault diagnosis applications.
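The two ingredients named in the abstract can be illustrated with a minimal NumPy sketch: a Soft Brownian Offset-style sampler that walks a seed sample away from the in-distribution set, and a supervised contrastive loss in which the generated out-of-distribution embeddings participate only as negatives. This is an illustrative sketch under assumed hyperparameters (`d_min`, `d_off`, `tau`) and function names of our choosing, not the authors' implementation.

```python
import numpy as np

def soft_brownian_offset(X, d_min=0.5, d_off=0.1, n_steps=200, rng=None):
    """Sketch of Soft Brownian Offset sampling: random-walk a seed taken from
    the in-distribution data X until its nearest-neighbor distance to X
    exceeds d_min (or the step budget runs out)."""
    rng = np.random.default_rng() if rng is None else rng
    x = X[rng.integers(len(X))].copy()          # start from a random ID sample
    for _ in range(n_steps):
        step = rng.normal(size=x.shape)
        step *= d_off / np.linalg.norm(step)    # random direction, length d_off
        x = x + step
        if np.min(np.linalg.norm(X - x, axis=1)) > d_min:
            break                               # far enough from all ID data
    return x

def supcon_loss_with_ood(z, labels, z_ood, tau=0.1):
    """Supervised contrastive loss where OOD embeddings appear only in the
    denominator (as negatives), never as positives."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    z_ood = z_ood / np.linalg.norm(z_ood, axis=1, keepdims=True)
    z_all = np.vstack([z, z_ood])
    n = len(z)
    sim = z @ z_all.T / tau                     # anchors x (labeled + OOD)
    losses = []
    for i in range(n):
        mask = np.ones(len(z_all), dtype=bool)
        mask[i] = False                         # exclude self-similarity
        log_den = np.log(np.exp(sim[i][mask]).sum())
        pos = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not pos:
            continue                            # no positive pair for this anchor
        losses.append(-np.mean([sim[i][j] - log_den for j in pos]))
    return float(np.mean(losses))
```

Treating the SBO samples purely as negatives is what pushes the embedding of the normal class away from the simulated unknown-fault region, so that genuinely unknown faults are less likely to fall inside the normal cluster at test time.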