A natural language processing system that links medical terms in electronic health record notes to lay definitions: system development using physician reviews. Chen, J., Druhl, E., Polepalli Ramesh, B., Houston, T. K., Brandt, C. A., Zulman, D. M., Vimalananda, V. G., Malkani, S., & Yu, H. Journal of Medical Internet Research, 20(1):e26, January 2018. doi: 10.2196/jmir.8669
@article{chen_natural_2018,
title = {A natural language processing system that links medical terms in electronic health record notes to lay definitions: system development using physician reviews},
volume = {20},
issn = {1438-8871},
shorttitle = {A natural language processing system that links medical terms in electronic health record notes to lay definitions},
doi = {10.2196/jmir.8669},
abstract = {BACKGROUND: Many health care systems now allow patients to access their electronic health record (EHR) notes online through patient portals. Medical jargon in EHR notes can confuse patients, which may interfere with potential benefits of patient access to EHR notes.
OBJECTIVE: The aim of this study was to develop and evaluate the usability and content quality of NoteAid, a Web-based natural language processing system that links medical terms in EHR notes to lay definitions, that is, definitions easily understood by lay people.
METHODS: NoteAid incorporates two core components: CoDeMed, a lexical resource of lay definitions for medical terms, and MedLink, a computational unit that links medical terms to lay definitions. We developed innovative computational methods, including an adapted distant supervision algorithm to prioritize medical terms important for EHR comprehension to facilitate the effort of building CoDeMed. Ten physician domain experts evaluated the user interface and content quality of NoteAid. The evaluation protocol included a cognitive walkthrough session and a postsession questionnaire. Physician feedback sessions were audio-recorded. We used standard content analysis methods to analyze qualitative data from these sessions.
RESULTS: Physician feedback was mixed. Positive feedback on NoteAid included (1) Easy to use, (2) Good visual display, (3) Satisfactory system speed, and (4) Adequate lay definitions. Opportunities for improvement arising from evaluation sessions and feedback included (1) improving the display of definitions for partially matched terms, (2) including more medical terms in CoDeMed, (3) improving the handling of terms whose definitions vary depending on different contexts, and (4) standardizing the scope of definitions for medicines. On the basis of these results, we have improved NoteAid's user interface and a number of definitions, and added 4502 more definitions in CoDeMed.
CONCLUSIONS: Physician evaluation yielded useful feedback for content validation and refinement of this innovative tool that has the potential to improve patient EHR comprehension and experience using patient portals. Future ongoing work will develop algorithms to handle ambiguous medical terms and test and evaluate NoteAid with patients.},
language = {eng},
number = {1},
journal = {Journal of Medical Internet Research},
author = {Chen, Jinying and Druhl, Emily and Polepalli Ramesh, Balaji and Houston, Thomas K. and Brandt, Cynthia A. and Zulman, Donna M. and Vimalananda, Varsha G. and Malkani, Samir and Yu, Hong},
month = jan,
year = {2018},
pmid = {29358159},
pmcid = {PMC5799720},
keywords = {computer software, consumer health informatics, electronic health records, natural language processing, usability testing},
pages = {e26},
}
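The abstract describes MedLink as a computational unit that links medical terms in note text to lay definitions stored in the CoDeMed lexical resource. The paper does not publish code here; the sketch below is only an illustration of that general idea, assuming a simple dictionary-based, longest-match linker. The function name and the tiny example lexicon are hypothetical and not taken from the published system.

```python
# Illustrative sketch only: a greedy longest-match linker that maps medical
# terms in note text to lay definitions, in the spirit of the MedLink /
# CoDeMed design described in the abstract. Lexicon entries and names here
# are hypothetical placeholders, not the published resource.
import re

# Hypothetical mini-lexicon standing in for CoDeMed (term -> lay definition).
LAY_LEXICON = {
    "hypertension": "high blood pressure",
    "myocardial infarction": "heart attack",
    "edema": "swelling caused by fluid building up in the body's tissues",
}

def link_terms(note_text, lexicon=LAY_LEXICON):
    """Return (term, start, end, definition) tuples for lexicon terms found
    in the note, preferring longer (multi-word) matches over shorter ones."""
    hits = []
    # Try longer terms first so a multi-word entry such as
    # "myocardial infarction" wins over any shorter entry it overlaps.
    for term in sorted(lexicon, key=len, reverse=True):
        for m in re.finditer(r"\b" + re.escape(term) + r"\b",
                             note_text, re.IGNORECASE):
            # Skip spans already covered by an earlier (longer) match.
            if not any(m.start() < e and s < m.end() for _, s, e, _ in hits):
                hits.append((term, m.start(), m.end(), lexicon[term]))
    return sorted(hits, key=lambda h: h[1])

if __name__ == "__main__":
    note = ("Patient with hypertension and prior myocardial infarction; "
            "mild edema noted.")
    for term, start, end, definition in link_terms(note):
        print(f"{term!r} [{start}:{end}] -> {definition}")
```

A production system of the kind the paper evaluates would go well beyond this, for example handling partial matches, context-dependent senses, and term prioritization via the adapted distant supervision algorithm mentioned in the abstract.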
{"_id":"8NKAL4py4GAZqtCaL","bibbaseid":"chen-druhl-polepalliramesh-houston-brandt-zulman-vimalananda-malkani-etal-anaturallanguageprocessingsystemthatlinksmedicaltermsinelectronichealthrecordnotestolaydefinitionssystemdevelopmentusingphysicianreviews-2018","author_short":["Chen, J.","Druhl, E.","Polepalli Ramesh, B.","Houston, T. K.","Brandt, C. A.","Zulman, D. M.","Vimalananda, V. G.","Malkani, S.","Yu, H."],"bibdata":{"bibtype":"article","type":"article","title":"A natural language processing system that links medical terms in electronic health record notes to lay definitions: system development using physician reviews","volume":"20","issn":"1438-8871","shorttitle":"A natural language processing system that links medical terms in electronic health record notes to lay definitions","doi":"10.2196/jmir.8669","abstract":"BACKGROUND: Many health care systems now allow patients to access their electronic health record (EHR) notes online through patient portals. Medical jargon in EHR notes can confuse patients, which may interfere with potential benefits of patient access to EHR notes. OBJECTIVE: The aim of this study was to develop and evaluate the usability and content quality of NoteAid, a Web-based natural language processing system that links medical terms in EHR notes to lay definitions, that is, definitions easily understood by lay people. METHODS: NoteAid incorporates two core components: CoDeMed, a lexical resource of lay definitions for medical terms, and MedLink, a computational unit that links medical terms to lay definitions. We developed innovative computational methods, including an adapted distant supervision algorithm to prioritize medical terms important for EHR comprehension to facilitate the effort of building CoDeMed. Ten physician domain experts evaluated the user interface and content quality of NoteAid. The evaluation protocol included a cognitive walkthrough session and a postsession questionnaire. Physician feedback sessions were audio-recorded. We used standard content analysis methods to analyze qualitative data from these sessions. RESULTS: Physician feedback was mixed. Positive feedback on NoteAid included (1) Easy to use, (2) Good visual display, (3) Satisfactory system speed, and (4) Adequate lay definitions. Opportunities for improvement arising from evaluation sessions and feedback included (1) improving the display of definitions for partially matched terms, (2) including more medical terms in CoDeMed, (3) improving the handling of terms whose definitions vary depending on different contexts, and (4) standardizing the scope of definitions for medicines. On the basis of these results, we have improved NoteAid's user interface and a number of definitions, and added 4502 more definitions in CoDeMed. CONCLUSIONS: Physician evaluation yielded useful feedback for content validation and refinement of this innovative tool that has the potential to improve patient EHR comprehension and experience using patient portals. 
Future ongoing work will develop algorithms to handle ambiguous medical terms and test and evaluate NoteAid with patients.","language":"eng","number":"1","journal":"Journal of Medical Internet Research","author":[{"propositions":[],"lastnames":["Chen"],"firstnames":["Jinying"],"suffixes":[]},{"propositions":[],"lastnames":["Druhl"],"firstnames":["Emily"],"suffixes":[]},{"propositions":[],"lastnames":["Polepalli","Ramesh"],"firstnames":["Balaji"],"suffixes":[]},{"propositions":[],"lastnames":["Houston"],"firstnames":["Thomas","K."],"suffixes":[]},{"propositions":[],"lastnames":["Brandt"],"firstnames":["Cynthia","A."],"suffixes":[]},{"propositions":[],"lastnames":["Zulman"],"firstnames":["Donna","M."],"suffixes":[]},{"propositions":[],"lastnames":["Vimalananda"],"firstnames":["Varsha","G."],"suffixes":[]},{"propositions":[],"lastnames":["Malkani"],"firstnames":["Samir"],"suffixes":[]},{"propositions":[],"lastnames":["Yu"],"firstnames":["Hong"],"suffixes":[]}],"month":"January","year":"2018","pmid":"29358159 PMCID: PMC5799720","keywords":"computer software, consumer health informatics, electronic health records, natural language processing, usability testing","pages":"e26","bibtex":"@article{chen_natural_2018,\n\ttitle = {A natural language processing system that links medical terms in electronic health record notes to lay definitions: system development using physician reviews},\n\tvolume = {20},\n\tissn = {1438-8871},\n\tshorttitle = {A natural language processing system that links medical terms in electronic health record notes to lay definitions},\n\tdoi = {10.2196/jmir.8669},\n\tabstract = {BACKGROUND: Many health care systems now allow patients to access their electronic health record (EHR) notes online through patient portals. Medical jargon in EHR notes can confuse patients, which may interfere with potential benefits of patient access to EHR notes.\nOBJECTIVE: The aim of this study was to develop and evaluate the usability and content quality of NoteAid, a Web-based natural language processing system that links medical terms in EHR notes to lay definitions, that is, definitions easily understood by lay people.\nMETHODS: NoteAid incorporates two core components: CoDeMed, a lexical resource of lay definitions for medical terms, and MedLink, a computational unit that links medical terms to lay definitions. We developed innovative computational methods, including an adapted distant supervision algorithm to prioritize medical terms important for EHR comprehension to facilitate the effort of building CoDeMed. Ten physician domain experts evaluated the user interface and content quality of NoteAid. The evaluation protocol included a cognitive walkthrough session and a postsession questionnaire. Physician feedback sessions were audio-recorded. We used standard content analysis methods to analyze qualitative data from these sessions.\nRESULTS: Physician feedback was mixed. Positive feedback on NoteAid included (1) Easy to use, (2) Good visual display, (3) Satisfactory system speed, and (4) Adequate lay definitions. Opportunities for improvement arising from evaluation sessions and feedback included (1) improving the display of definitions for partially matched terms, (2) including more medical terms in CoDeMed, (3) improving the handling of terms whose definitions vary depending on different contexts, and (4) standardizing the scope of definitions for medicines. 
On the basis of these results, we have improved NoteAid's user interface and a number of definitions, and added 4502 more definitions in CoDeMed.\nCONCLUSIONS: Physician evaluation yielded useful feedback for content validation and refinement of this innovative tool that has the potential to improve patient EHR comprehension and experience using patient portals. Future ongoing work will develop algorithms to handle ambiguous medical terms and test and evaluate NoteAid with patients.},\n\tlanguage = {eng},\n\tnumber = {1},\n\tjournal = {Journal of Medical Internet Research},\n\tauthor = {Chen, Jinying and Druhl, Emily and Polepalli Ramesh, Balaji and Houston, Thomas K. and Brandt, Cynthia A. and Zulman, Donna M. and Vimalananda, Varsha G. and Malkani, Samir and Yu, Hong},\n\tmonth = jan,\n\tyear = {2018},\n\tpmid = {29358159 PMCID: PMC5799720},\n\tkeywords = {computer software, consumer health informatics, electronic health records, natural language processing, usability testing},\n\tpages = {e26},\n}\n\n","author_short":["Chen, J.","Druhl, E.","Polepalli Ramesh, B.","Houston, T. K.","Brandt, C. A.","Zulman, D. M.","Vimalananda, V. G.","Malkani, S.","Yu, H."],"key":"chen_natural_2018","id":"chen_natural_2018","bibbaseid":"chen-druhl-polepalliramesh-houston-brandt-zulman-vimalananda-malkani-etal-anaturallanguageprocessingsystemthatlinksmedicaltermsinelectronichealthrecordnotestolaydefinitionssystemdevelopmentusingphysicianreviews-2018","role":"author","urls":{},"keyword":["computer software","consumer health informatics","electronic health records","natural language processing","usability testing"],"metadata":{"authorlinks":{}},"html":""},"bibtype":"article","biburl":"http://fenway.cs.uml.edu/papers/pubs-all.bib","dataSources":["TqaA9miSB65nRfS5H"],"keywords":["computer software","consumer health informatics","electronic health records","natural language processing","usability testing"],"search_terms":["natural","language","processing","system","links","medical","terms","electronic","health","record","notes","lay","definitions","system","development","using","physician","reviews","chen","druhl","polepalli ramesh","houston","brandt","zulman","vimalananda","malkani","yu"],"title":"A natural language processing system that links medical terms in electronic health record notes to lay definitions: system development using physician reviews","year":2018}