Condensed Memory Networks for Clinical Diagnostic Inferencing. Prakash, A., Zhao, S., Hasan, S. A., Datla, V., Lee, K., Qadir, A., Liu, J., & Farri, O. arXiv:1612.01848 [cs], December 2016.
@article{prakash_condensed_2016,
	title = {Condensed {Memory} {Networks} for {Clinical} {Diagnostic} {Inferencing}},
	url = {http://arxiv.org/abs/1612.01848},
	abstract = {Diagnosis of a clinical condition is a challenging task, which often requires significant medical investigation. Previous work related to diagnostic inferencing problems mostly consider multivariate observational data (e.g. physiological signals, lab tests etc.). In contrast, we explore the problem using free-text medical notes recorded in an electronic health record (EHR). Complex tasks like these can benefit from structured knowledge bases, but those are not scalable. We instead exploit raw text from Wikipedia as a knowledge source. Memory networks have been demonstrated to be effective in tasks which require comprehension of free-form text. They use the final iteration of the learned representation to predict probable classes. We introduce condensed memory neural networks (C-MemNNs), a novel model with iterative condensation of memory representations that preserves the hierarchy of features in the memory. Experiments on the MIMIC-III dataset show that the proposed model outperforms other variants of memory networks to predict the most probable diagnoses given a complex clinical scenario.},
	urldate = {2017-01-12},
	journal = {arXiv:1612.01848 [cs]},
	author = {Prakash, Aaditya and Zhao, Siyuan and Hasan, Sadid A. and Datla, Vivek and Lee, Kathy and Qadir, Ashequl and Liu, Joey and Farri, Oladimeji},
	month = dec,
	year = {2016},
	note = {arXiv: 1612.01848},
	keywords = {Computer Science - Computation and Language},
}
