Self-supervised pretraining via contrast learning for intelligent incipient fault detection of bearings. Ding, Y., Zhuang, J., Ding, P., & Jia, M. Reliability Engineering & System Safety, 218:108126, February, 2022.
Data-driven approaches for prognostics and health management (PHM) increasingly rely on massive historical data, yet annotations are expensive and time-consuming. Learning approaches that exploit semi-labeled or unlabeled data are therefore becoming increasingly popular. In this paper, self-supervised pretraining via contrast learning (SSPCL) is introduced to learn discriminative representations from unlabeled bearing datasets. Specifically, SSPCL employs momentum contrast learning (MCL) to investigate local representations in terms of instance-level discrimination contrast. Further, we propose a specific architecture for deploying SSPCL on bearing vibration signals by presenting several data augmentations for 1D sequences. On this basis, we put forward an SSPCL-based incipient fault detection method for the run-to-failure cycle of rolling bearings. This approach transfers the SSPCL pretrained model to a specific semi-supervised downstream task, effectively utilizing all unlabeled data and relying on only a little a priori knowledge. A case study on the FEMTO-ST datasets shows that the fine-tuned model is competent for incipient fault detection, outperforming other state-of-the-art methods. Furthermore, a supplemental case on a self-built fault dataset further demonstrates the great potential and superiority of the proposed SSPCL method in PHM.
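The abstract describes MoCo-style momentum contrast pretraining applied to 1D vibration segments. The following is a minimal sketch of that general idea in PyTorch, not the paper's actual architecture: the encoder, the `augment_1d` augmentations, and all hyperparameters are illustrative assumptions.

```python
# Sketch of momentum-contrast (MoCo-style) pretraining on unlabeled 1-D
# vibration segments. Encoder, augmentations, and hyperparameters are
# placeholders, not the configuration used in the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

def augment_1d(x):
    """Hypothetical 1-D augmentations: additive noise plus random amplitude scaling."""
    x = x + 0.01 * torch.randn_like(x)
    scale = 0.9 + 0.2 * torch.rand(x.size(0), 1, 1, device=x.device)
    return x * scale

class Encoder1D(nn.Module):
    """Toy 1-D CNN mapping a vibration segment to a normalized embedding."""
    def __init__(self, dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(32, dim),
        )
    def forward(self, x):
        return F.normalize(self.net(x), dim=1)

class MoCo1D(nn.Module):
    def __init__(self, dim=128, queue_size=4096, momentum=0.999, temperature=0.07):
        super().__init__()
        self.m, self.t = momentum, temperature
        self.encoder_q = Encoder1D(dim)
        self.encoder_k = Encoder1D(dim)
        self.encoder_k.load_state_dict(self.encoder_q.state_dict())
        for p in self.encoder_k.parameters():
            p.requires_grad = False
        self.register_buffer("queue", F.normalize(torch.randn(dim, queue_size), dim=0))
        self.register_buffer("ptr", torch.zeros(1, dtype=torch.long))

    @torch.no_grad()
    def _momentum_update(self):
        # Key encoder tracks the query encoder via an exponential moving average.
        for pq, pk in zip(self.encoder_q.parameters(), self.encoder_k.parameters()):
            pk.data = pk.data * self.m + pq.data * (1.0 - self.m)

    @torch.no_grad()
    def _enqueue(self, keys):
        # Replace the oldest negatives in the queue (queue_size divisible by batch size).
        ptr = int(self.ptr)
        self.queue[:, ptr:ptr + keys.size(0)] = keys.T
        self.ptr[0] = (ptr + keys.size(0)) % self.queue.size(1)

    def forward(self, x):
        q = self.encoder_q(augment_1d(x))          # queries from one augmented view
        with torch.no_grad():
            self._momentum_update()
            k = self.encoder_k(augment_1d(x))      # keys from a second view
        l_pos = torch.einsum("nc,nc->n", q, k).unsqueeze(-1)      # positive logits
        l_neg = torch.einsum("nc,ck->nk", q, self.queue.clone())  # negatives from queue
        logits = torch.cat([l_pos, l_neg], dim=1) / self.t
        labels = torch.zeros(logits.size(0), dtype=torch.long)    # positive is index 0
        self._enqueue(k)
        return F.cross_entropy(logits, labels)     # instance-discrimination loss

# Example: one pretraining step on a batch of unlabeled segments (batch, 1, length).
model = MoCo1D()
loss = model(torch.randn(32, 1, 2048))
loss.backward()
```

After pretraining, the query encoder would be fine-tuned on a small labeled subset for the downstream fault-detection task, which is the semi-supervised transfer the abstract refers to.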
@article{ding_self-supervised_2022,
	title = {Self-supervised pretraining via contrast learning for intelligent incipient fault detection of bearings},
	volume = {218},
	issn = {0951-8320},
	url = {https://www.sciencedirect.com/science/article/pii/S0951832021006207},
	doi = {10.1016/j.ress.2021.108126},
	abstract = {Data-driven approaches for prognostics and health management (PHM) increasingly rely on massive historical data, yet annotations are expensive and time-consuming. Learning approaches that exploit semi-labeled or unlabeled data are therefore becoming increasingly popular. In this paper, self-supervised pretraining via contrast learning (SSPCL) is introduced to learn discriminative representations from unlabeled bearing datasets. Specifically, SSPCL employs momentum contrast learning (MCL) to investigate local representations in terms of instance-level discrimination contrast. Further, we propose a specific architecture for deploying SSPCL on bearing vibration signals by presenting several data augmentations for 1D sequences. On this basis, we put forward an SSPCL-based incipient fault detection method for the run-to-failure cycle of rolling bearings. This approach transfers the SSPCL pretrained model to a specific semi-supervised downstream task, effectively utilizing all unlabeled data and relying on only a little a priori knowledge. A case study on the FEMTO-ST datasets shows that the fine-tuned model is competent for incipient fault detection, outperforming other state-of-the-art methods. Furthermore, a supplemental case on a self-built fault dataset further demonstrates the great potential and superiority of the proposed SSPCL method in PHM.},
	language = {en},
	urldate = {2021-11-29},
	journal = {Reliability Engineering \& System Safety},
	author = {Ding, Yifei and Zhuang, Jichao and Ding, Peng and Jia, Minping},
	month = feb,
	year = {2022},
	keywords = {Data augmentation, Fault diagnosis, Incipient fault detection, Prognostic and health management, Self-supervised pretraining, Unsupervised learning},
	pages = {108126},
}
