@inproceedings{rieth_issues_2018,
	address = {Cham},
	series = {Advances in {Intelligent} {Systems} and {Computing}},
	title = {Issues and {Advances} in {Anomaly} {Detection} {Evaluation} for {Joint} {Human}-{Automated} {Systems}},
	isbn = {978-3-319-60384-1},
	doi = {10.1007/978-3-319-60384-1_6},
	abstract = {As human-managed systems become more complex, automated anomaly detection can provide assistance—but only if it is effective. Rigorous evaluation of automated detection is vital for determining its effectiveness before implementation into systems. We identified recurring issues in evaluation practices limiting the conclusions that can be applied from published studies to broader application. In this paper, we demonstrate the implications of these issues and illustrate solutions. We show how receiver operating characteristic curves can reveal performance tradeoffs masked by reporting of single metric results and how using multiple simulation data examples can prevent biases that result from evaluation using single training and testing examples. We also provide methods for incorporating detection latency into tradeoff analyses. Application of these methods will help to provide researchers, engineers, and decision makers with a more objective basis for anomaly detection performance evaluation, resulting in greater utility, better performance, and cost savings in systems engineering.},
	language = {en},
	booktitle = {Advances in {Human} {Factors} in {Robots} and {Unmanned} {Systems}},
	publisher = {Springer International Publishing},
	author = {Rieth, Cory A. and Amsel, Ben D. and Tran, Randy and Cook, Maia B.},
	editor = {Chen, Jessie},
	year = {2018},
	keywords = {Anomaly detection, Automation evaluation, Receiver operating characteristic, Tennessee Eastman process simulation},
	pages = {52--63},
}
