Self-supervised pretraining via contrast learning for intelligent incipient fault detection of bearings. Ding, Y., Zhuang, J., Ding, P., & Jia, M. Reliability Engineering & System Safety, 218:108126, February 2022. doi: 10.1016/j.ress.2021.108126.

Abstract: Data-driven approaches for prognostic and health management (PHM) increasingly rely on massive historical data, yet annotations are expensive and time-consuming. Learning approaches that utilize semi-labeled or unlabeled data are therefore becoming increasingly popular. In this paper, self-supervised pretraining via contrast learning (SSPCL) is introduced to learn discriminative representations from unlabeled bearing datasets. Specifically, SSPCL employs momentum contrast learning (MCL) to investigate local representations in terms of instance-level discrimination contrast. Further, we propose a specific architecture for deploying SSPCL on bearing vibration signals by presenting several data augmentations for 1D sequences. On this basis, we put forward an incipient fault detection method based on SSPCL for the run-to-failure cycle of rolling bearings. This approach transfers the SSPCL pre-trained model to a specific semi-supervised downstream task, effectively utilizing all unlabeled data while relying on only a little a priori knowledge. A case study on the FEMTO-ST datasets shows that the fine-tuned model is competent for incipient fault detection, outperforming other state-of-the-art methods. Furthermore, a supplemental case on a self-built fault dataset further demonstrates the great potential and superiority of our proposed SSPCL method in PHM.
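The abstract describes momentum contrast learning on unlabeled 1D vibration segments with signal-level augmentations. Since no code accompanies this entry, the following is a minimal illustrative PyTorch sketch of that general technique (MoCo-style instance discrimination with a key queue), not the authors' implementation: the `Encoder1D` architecture, the `augment` function, the `moco_step` helper, and all hyperparameters are assumptions for illustration only.

```python
# Minimal sketch (not the authors' code): MoCo-style contrastive pretraining
# on unlabeled 1D vibration segments with simple signal augmentations.
# All names and hyperparameters below are illustrative assumptions.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder1D(nn.Module):
    """Tiny 1D-CNN encoder mapping a vibration segment to a unit-norm embedding."""
    def __init__(self, emb_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=64, stride=8), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(32, emb_dim),
        )

    def forward(self, x):                        # x: (B, 1, L)
        return F.normalize(self.net(x), dim=1)   # L2-normalized embeddings

def augment(x):
    """Illustrative 1D augmentations: additive noise and random amplitude scaling."""
    noise = 0.01 * torch.randn_like(x)
    scale = 1.0 + 0.1 * (torch.rand(x.size(0), 1, 1) - 0.5)
    return scale * x + noise

def moco_step(q_enc, k_enc, queue, x, t=0.07, m=0.999):
    """One momentum-contrast update: InfoNCE against a FIFO queue of negatives."""
    q = q_enc(augment(x))                        # query embeddings (with grad)
    with torch.no_grad():
        # momentum update of the key encoder parameters
        for pq, pk in zip(q_enc.parameters(), k_enc.parameters()):
            pk.data.mul_(m).add_(pq.data, alpha=1 - m)
        k = k_enc(augment(x))                    # key embeddings (no grad)
    l_pos = torch.einsum("nc,nc->n", q, k).unsqueeze(1)   # positive logits (B, 1)
    l_neg = torch.einsum("nc,ck->nk", q, queue)           # negative logits (B, K)
    logits = torch.cat([l_pos, l_neg], dim=1) / t
    labels = torch.zeros(x.size(0), dtype=torch.long)     # positive is class 0
    loss = F.cross_entropy(logits, labels)
    new_queue = torch.cat([queue, k.t()], dim=1)[:, k.size(0):]  # enqueue / dequeue
    return loss, new_queue

# Usage sketch: pretrain on unlabeled segments (batch shapes are assumptions).
q_enc = Encoder1D()
k_enc = copy.deepcopy(q_enc)
queue = F.normalize(torch.randn(128, 4096), dim=0)        # (emb_dim, K) negatives
opt = torch.optim.SGD(q_enc.parameters(), lr=0.03, momentum=0.9)
x = torch.randn(32, 1, 2048)                              # a batch of raw segments
loss, queue = moco_step(q_enc, k_enc, queue, x)
opt.zero_grad(); loss.backward(); opt.step()
```

After pretraining of this kind, the encoder would typically be transferred to the downstream detection task by attaching a small classification head and fine-tuning on the few labeled segments, which is the semi-supervised transfer step the abstract refers to.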
@article{ding_self-supervised_2022,
title = {Self-supervised pretraining via contrast learning for intelligent incipient fault detection of bearings},
volume = {218},
issn = {0951-8320},
url = {https://www.sciencedirect.com/science/article/pii/S0951832021006207},
doi = {10.1016/j.ress.2021.108126},
	abstract = {Data-driven approaches for prognostic and health management (PHM) increasingly rely on massive historical data, yet annotations are expensive and time-consuming. Learning approaches that utilize semi-labeled or unlabeled data are therefore becoming increasingly popular. In this paper, self-supervised pretraining via contrast learning (SSPCL) is introduced to learn discriminative representations from unlabeled bearing datasets. Specifically, SSPCL employs momentum contrast learning (MCL) to investigate local representations in terms of instance-level discrimination contrast. Further, we propose a specific architecture for deploying SSPCL on bearing vibration signals by presenting several data augmentations for 1D sequences. On this basis, we put forward an incipient fault detection method based on SSPCL for the run-to-failure cycle of rolling bearings. This approach transfers the SSPCL pre-trained model to a specific semi-supervised downstream task, effectively utilizing all unlabeled data while relying on only a little a priori knowledge. A case study on the FEMTO-ST datasets shows that the fine-tuned model is competent for incipient fault detection, outperforming other state-of-the-art methods. Furthermore, a supplemental case on a self-built fault dataset further demonstrates the great potential and superiority of our proposed SSPCL method in PHM.},
language = {en},
urldate = {2021-11-29},
journal = {Reliability Engineering \& System Safety},
author = {Ding, Yifei and Zhuang, Jichao and Ding, Peng and Jia, Minping},
month = feb,
year = {2022},
keywords = {Data augmentation, Fault diagnosis, Incipient fault detection, Prognostic and health management, Self-supervised pretraining, Unsupervised learning},
pages = {108126},
}
{"_id":"rcTkfj26EdogTrEbt","bibbaseid":"ding-zhuang-ding-jia-selfsupervisedpretrainingviacontrastlearningforintelligentincipientfaultdetectionofbearings-2022","author_short":["Ding, Y.","Zhuang, J.","Ding, P.","Jia, M."],"bibdata":{"bibtype":"article","type":"article","title":"Self-supervised pretraining via contrast learning for intelligent incipient fault detection of bearings","volume":"218","issn":"0951-8320","url":"https://www.sciencedirect.com/science/article/pii/S0951832021006207","doi":"10.1016/j.ress.2021.108126","abstract":"Data-driven approaches for prognostic and health management (PHM) increasingly rely on massive historical data, yet annotations are expensive and time-consuming. Learning approaches that utilize semi-labeled or unlabeled data are becoming increasingly popular. In this paper, a self-supervised pre-training via contrast learning (SSPCL) is introduced to learn discriminative representations from unlabeled bearing datasets. Specifically, the SSPCL employs momentum contrast learning (MCL) to investigate the local representation in terms of instance-level discrimination contrast. Further, we propose a specific architecture for SSPCL deployment on bearing vibration signals by presenting several data augmentations for 1D sequences. On this basis, we put forward an incipient fault detection method based on SSPCL for run-to-failure cycle of rolling bearings. This approach transfers the SSPCL pre-trained model to a specific semi-supervised downstream task, effectively utilizing all unlabeled data and relying on only a little priori knowledge. A case study on FEMTO-ST datasets shows that the fine-tuned model is competent for incipient fault detection, outperforming other state-of-the-art methods. Furthermore, a supplemental case on a self-built fault datasets further demonstrate the great potential and superiority of our proposed SSPCL method in PHM.","language":"en","urldate":"2021-11-29","journal":"Reliability Engineering & System Safety","author":[{"propositions":[],"lastnames":["Ding"],"firstnames":["Yifei"],"suffixes":[]},{"propositions":[],"lastnames":["Zhuang"],"firstnames":["Jichao"],"suffixes":[]},{"propositions":[],"lastnames":["Ding"],"firstnames":["Peng"],"suffixes":[]},{"propositions":[],"lastnames":["Jia"],"firstnames":["Minping"],"suffixes":[]}],"month":"February","year":"2022","keywords":"Data augmentation, Fault diagnosis, Incipient fault detection, Prognostic and health management, Self-supervised pretraining, Unsupervised learning","pages":"108126","bibtex":"@article{ding_self-supervised_2022,\n\ttitle = {Self-supervised pretraining via contrast learning for intelligent incipient fault detection of bearings},\n\tvolume = {218},\n\tissn = {0951-8320},\n\turl = {https://www.sciencedirect.com/science/article/pii/S0951832021006207},\n\tdoi = {10.1016/j.ress.2021.108126},\n\tabstract = {Data-driven approaches for prognostic and health management (PHM) increasingly rely on massive historical data, yet annotations are expensive and time-consuming. Learning approaches that utilize semi-labeled or unlabeled data are becoming increasingly popular. In this paper, a self-supervised pre-training via contrast learning (SSPCL) is introduced to learn discriminative representations from unlabeled bearing datasets. Specifically, the SSPCL employs momentum contrast learning (MCL) to investigate the local representation in terms of instance-level discrimination contrast. 
Further, we propose a specific architecture for SSPCL deployment on bearing vibration signals by presenting several data augmentations for 1D sequences. On this basis, we put forward an incipient fault detection method based on SSPCL for run-to-failure cycle of rolling bearings. This approach transfers the SSPCL pre-trained model to a specific semi-supervised downstream task, effectively utilizing all unlabeled data and relying on only a little priori knowledge. A case study on FEMTO-ST datasets shows that the fine-tuned model is competent for incipient fault detection, outperforming other state-of-the-art methods. Furthermore, a supplemental case on a self-built fault datasets further demonstrate the great potential and superiority of our proposed SSPCL method in PHM.},\n\tlanguage = {en},\n\turldate = {2021-11-29},\n\tjournal = {Reliability Engineering \\& System Safety},\n\tauthor = {Ding, Yifei and Zhuang, Jichao and Ding, Peng and Jia, Minping},\n\tmonth = feb,\n\tyear = {2022},\n\tkeywords = {Data augmentation, Fault diagnosis, Incipient fault detection, Prognostic and health management, Self-supervised pretraining, Unsupervised learning},\n\tpages = {108126},\n}\n\n\n\n","author_short":["Ding, Y.","Zhuang, J.","Ding, P.","Jia, M."],"key":"ding_self-supervised_2022","id":"ding_self-supervised_2022","bibbaseid":"ding-zhuang-ding-jia-selfsupervisedpretrainingviacontrastlearningforintelligentincipientfaultdetectionofbearings-2022","role":"author","urls":{"Paper":"https://www.sciencedirect.com/science/article/pii/S0951832021006207"},"keyword":["Data augmentation","Fault diagnosis","Incipient fault detection","Prognostic and health management","Self-supervised pretraining","Unsupervised learning"],"metadata":{"authorlinks":{}},"html":""},"bibtype":"article","biburl":"https://bibbase.org/zotero/mh_lenguyen","dataSources":["iwKepCrWBps7ojhDx"],"keywords":["data augmentation","fault diagnosis","incipient fault detection","prognostic and health management","self-supervised pretraining","unsupervised learning"],"search_terms":["self","supervised","pretraining","via","contrast","learning","intelligent","incipient","fault","detection","bearings","ding","zhuang","ding","jia"],"title":"Self-supervised pretraining via contrast learning for intelligent incipient fault detection of bearings","year":2022}