Interinstance and Intratemporal Self Supervised Learning With Few Labeled Data for Fault Diagnosis

Abstract:

Recent research on intelligent fault diagnosis has achieved great progress. In practical scenarios, however, labeled data are scarce because annotation is difficult, which raises the risk of overfitting and hinders industrial deployment. To address this problem, in this article, we propose an interinstance and intratemporal self-supervised learning framework, where self-supervised learning on massive unlabeled data is integrated with supervised learning on few labeled data to enlarge the pool of learnable data. Specifically, we design a time-amplitude signal augmentation technique and conduct interinstance transform-consistency learning to obtain domain-invariant features. Meanwhile, an intratemporal relation matching task is introduced to improve the temporal discriminability of the model. Moreover, to overcome the single-task domination problem in this multitask framework, an uncertainty-based dynamic weighting mechanism automatically assigns a weight to each task according to its uncertainty, which stabilizes multitask optimization. Experiments on open-source and self-designed datasets demonstrate the superiority of the proposed framework over other supervised and semisupervised methods.
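The uncertainty-based dynamic weighting described above can be sketched as follows. This is a minimal illustration, not the authors' exact implementation: it assumes the standard homoscedastic-uncertainty formulation (learning a log-variance s_i per task and minimizing exp(-s_i)·L_i + s_i), with the class name `UncertaintyWeighting` and the use of PyTorch being our own choices.

```python
import torch
import torch.nn as nn


class UncertaintyWeighting(nn.Module):
    """Combine multiple task losses with learned per-task uncertainty.

    Each task i gets a learnable log-variance s_i; the combined loss is
    sum_i exp(-s_i) * L_i + s_i, so tasks whose uncertainty grows are
    automatically down-weighted, preventing single-task domination.
    (A sketch of the standard homoscedastic-uncertainty scheme; the
    paper's exact formulation may differ.)
    """

    def __init__(self, num_tasks: int):
        super().__init__()
        # Initialize log-variances to 0, i.e., unit weight per task.
        self.log_vars = nn.Parameter(torch.zeros(num_tasks))

    def forward(self, losses):
        # losses: sequence of scalar task losses (e.g., supervised loss,
        # transform-consistency loss, temporal relation-matching loss).
        total = torch.zeros((), dtype=self.log_vars.dtype)
        for s, loss in zip(self.log_vars, losses):
            total = total + torch.exp(-s) * loss + s
        return total
```

Because `log_vars` is a `nn.Parameter`, the task weights are updated by the same optimizer step as the model weights; no manual weight scheduling is needed.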