Equipment Health Indicator Learning Using Deep Reinforcement Learning. Zhang, C., Gupta, C., Farahat, A., Ristovski, K., & Ghosh, D. In Brefeld, U., Curry, E., Daly, E., MacNamee, B., Marascu, A., Pinelli, F., Berlingerio, M., & Hurley, N., editors, Machine Learning and Knowledge Discovery in Databases, Lecture Notes in Computer Science, pages 488–504, Cham, 2019. Springer International Publishing.
Predictive Maintenance (PdM) is gaining popularity in industrial operations as it leverages the power of Machine Learning and Internet of Things (IoT) to predict the future health status of equipment. Health Indicator Learning (HIL) plays an important role in PdM as it learns a health curve representing the health conditions of equipment over time, so that health degradation is visually monitored and optimal planning can be performed accordingly to minimize the equipment downtime. However, HIL is a hard problem due to the fact that there is usually no way to access the actual health of the equipment during most of its operation. Traditionally, HIL is addressed by hand-crafting domain-specific performance indicators or through physical modeling, which is expensive and inapplicable for some industries. In this paper, we propose a purely data-driven approach for solving the HIL problem based on Deep Reinforcement Learning (DRL). Our key insight is that the HIL problem can be mapped to a credit assignment problem. Then DRL learns from failures by naturally backpropagating the credit of failures into intermediate states. In particular, given the observed time series of sensor, operating and event (failure) data, we learn a sequence of health indicators that represent the underlying health conditions of physical equipment. We demonstrate that the proposed methods significantly outperform the state-of-the-art methods for HIL and provide explainable insights about the equipment health. In addition, we propose the use of the learned health indicators to predict when the equipment is going to reach its end-of-life, and demonstrate how an explainable health curve is way more useful for a decision maker than a single-number prediction by a black-box model. The proposed approach has a great potential in a broader range of systems (e.g., economical and biological) as a general framework for the automatic learning of the underlying performance of complex systems.
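
The abstract's central idea, treating health indicator learning as a credit assignment problem in which the penalty for an observed failure is propagated back to the states that preceded it, can be illustrated with a minimal value-learning sketch. The sketch below is not the authors' method: it uses tabular TD(0) on synthetic run-to-failure sequences rather than the deep RL on continuous sensor data described in the paper, and the synthetic data generator, the state discretization, and all hyperparameters are assumptions made purely for illustration.

# Minimal illustrative sketch (not the authors' implementation): a tabular
# TD(0) value function learned from run-to-failure episodes, where the only
# reward is -1 at the failure event, backs the "credit" of the failure up
# into the intermediate states. The rescaled values then serve as a health
# indicator. Synthetic data and hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
N_STATES = 10  # assumed discretization of the sensor/operating data

def simulate_run_to_failure(length=200):
    """Synthetic degradation path: the state index drifts from healthy (low) to failure (high)."""
    return np.sort(rng.integers(0, N_STATES, size=length))

def td0_values(episodes, alpha=0.1, gamma=0.99):
    """Tabular TD(0): the terminal failure reward is propagated back into earlier states."""
    V = np.zeros(N_STATES)
    for ep in episodes:
        for t in range(len(ep) - 1):
            s, s_next = ep[t], ep[t + 1]
            terminal = (t + 1 == len(ep) - 1)
            r = -1.0 if terminal else 0.0          # failure is observed only at the end of the run
            target = r if terminal else r + gamma * V[s_next]
            V[s] += alpha * (target - V[s])
    return V

episodes = [simulate_run_to_failure() for _ in range(500)]
V = td0_values(episodes)

# Rescale the learned values into a [0, 1] health indicator: ~1 = healthy, ~0 = near failure.
health = (V - V.min()) / (V.max() - V.min())
print(np.round(health, 2))

Rescaling the learned values so that healthy states map near 1 and near-failure states map near 0 yields the kind of monotone health curve the abstract describes as useful for end-of-life planning.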
@inproceedings{zhang_equipment_2019,
	address = {Cham},
	series = {Lecture {Notes} in {Computer} {Science}},
	title = {Equipment {Health} {Indicator} {Learning} {Using} {Deep} {Reinforcement} {Learning}},
	isbn = {978-3-030-10997-4},
	doi = {10.1007/978-3-030-10997-4_30},
	language = {en},
	booktitle = {Machine {Learning} and {Knowledge} {Discovery} in {Databases}},
	publisher = {Springer International Publishing},
	author = {Zhang, Chi and Gupta, Chetan and Farahat, Ahmed and Ristovski, Kosta and Ghosh, Dipanjan},
	editor = {Brefeld, Ulf and Curry, Edward and Daly, Elizabeth and MacNamee, Brian and Marascu, Alice and Pinelli, Fabio and Berlingerio, Michele and Hurley, Neil},
	year = {2019},
	keywords = {Deep Reinforcement Learning, Health indicator learning, Predictive Maintenance},
	pages = {488--504},
}
