A controlled trial of automated classification of negation from clinical notes

Peter L. Elkin, Steven H. Brown, Brent A. Bauer, Casey S. Husser, William Carruth, Larry R. Bergstrom, Dietlind L. Wahner-Roedler

Research output: Contribution to journal › Article

65 Citations (Scopus)

Abstract

Background: Identification of negation in electronic health records is essential if we are to understand the computable meaning of the records. Our objective was to compare the accuracy of an automated mechanism for assignment of negation to clinical concepts within a compositional expression with human-assigned negation, and to perform a failure analysis to identify the causes of poorly identified negation (i.e., missed conceptual representation, inaccurate conceptual representation, missed negation, and inaccurate identification of negation). Methods: 41 clinical documents (medical evaluations; outside of Mayo these are sometimes referred to as History and Physical Examinations) were parsed using the Mayo Vocabulary Server Parsing Engine. SNOMED-CT™ was used to provide concept coverage for the clinical concepts in the records. Parsing resulted in identification of concepts and textual clues to negation. The records were reviewed by an independent medical terminologist, and the results were tallied in a spreadsheet. Where questions arose during review, Internal Medicine faculty made the final determination. Results: SNOMED-CT was used to provide concept coverage of the 14,792 concepts in 41 health records from Johns Hopkins University. Of these, 1,823 concepts were identified as negative by human review. The sensitivity (recall) of the assignment of negation was 97.2% (p < 0.001, Pearson chi-square test, compared to a coin flip). The specificity of assignment of negation was 98.8%. The positive likelihood ratio of the negation was 81. The positive predictive value (precision) was 91.2%. Conclusion: Automated assignment of negation to concepts identified in health records based on review of the text is feasible and practical. Lexical assignment of negation is a good test of true negativity, as judged by the high sensitivity, specificity, and positive likelihood ratio of the test. SNOMED-CT had overall coverage of 88.7% of the concepts being negated.
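The reported metrics are mutually consistent. As a sanity check, here is a minimal Python sketch that back-calculates the implied 2×2 confusion matrix from the abstract's figures; the TP/FP/TN/FN counts are hypothetical reconstructions (rounded), not values taken from the paper:

```python
# Reconstruct the confusion matrix implied by the abstract's figures.
# Only total_concepts, negated_by_human, sensitivity, and specificity
# come from the abstract; the cell counts are back-calculated estimates.

total_concepts = 14_792                 # concepts in the 41 health records
negated_by_human = 1_823                # gold-standard negated concepts
not_negated = total_concepts - negated_by_human   # 12,969

sensitivity = 0.972                     # recall, as reported
specificity = 0.988                     # as reported

tp = round(sensitivity * negated_by_human)   # ~1,772 correctly flagged negations
fn = negated_by_human - tp                   # ~51 missed negations
tn = round(specificity * not_negated)        # ~12,813 correctly left unflagged
fp = not_negated - tn                        # ~156 spurious negations

ppv = tp / (tp + fp)                         # positive predictive value (precision)
lr_pos = sensitivity / (1 - specificity)     # positive likelihood ratio

print(f"PPV ~= {ppv:.3f}")    # close to the reported 91.2%
print(f"LR+ ~= {lr_pos:.0f}") # matches the reported 81
```

The reconstructed precision (~91.9%) differs slightly from the reported 91.2%, as expected when working back from rounded sensitivity and specificity.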

Original language: English (US)
Journal: BMC Medical Informatics and Decision Making
Volume: 5
DOI: 10.1186/1472-6947-5-13
State: Published - May 5, 2005

Fingerprint

Systematized Nomenclature of Medicine
Numismatics
Vocabulary
Electronic Health Records
Health
Chi-Square Distribution
Internal Medicine
Physical Examination
History
Sensitivity and Specificity

ASJC Scopus subject areas

  • Medicine (all)

Cite this

A controlled trial of automated classification of negation from clinical notes. / Elkin, Peter L.; Brown, Steven H.; Bauer, Brent A; Husser, Casey S.; Carruth, William; Bergstrom, Larry R.; Wahner-Roedler, Dietlind L.

In: BMC Medical Informatics and Decision Making, Vol. 5, 05.05.2005.

Research output: Contribution to journal › Article

Elkin, Peter L. ; Brown, Steven H. ; Bauer, Brent A ; Husser, Casey S. ; Carruth, William ; Bergstrom, Larry R. ; Wahner-Roedler, Dietlind L. / A controlled trial of automated classification of negation from clinical notes. In: BMC Medical Informatics and Decision Making. 2005 ; Vol. 5.
@article{d055b116562c4880b10f563a8b26435e,
title = "A controlled trial of automated classification of negation from clinical notes",
abstract = "Background: Identification of negation in electronic health records is essential if we are to understand the computable meaning of the records: Our objective is to compare the accuracy of an automated mechanism for assignment of Negation to clinical concepts within a compositional expression with Human Assigned Negation. Also to perform a failure analysis to identify the causes of poorly identified negation (i.e. Missed Conceptual Representation, Inaccurate Conceptual Representation, Missed Negation, Inaccurate identification of Negation). Methods: 41 Clinical Documents (Medical Evaluations; sometimes outside of Mayo these are referred to as History and Physical Examinations) were parsed using the Mayo Vocabulary Server Parsing Engine. SNOMED-CT™ was used to provide concept coverage for the clinical concepts in the record. These records resulted in identification of Concepts and textual clues to Negation. These records were reviewed by an independent medical terminologist, and the results were tallied in a spreadsheet. Where questions on the review arose Internal Medicine Faculty were employed to make a final determination. Results: SNOMED-CT was used to provide concept coverage of the 14,792 Concepts in 41 Health Records from John's Hopkins University. Of these, 1,823 Concepts were identified as negative by Human review. The sensitivity (Recall) of the assignment of negation was 97.2{\%} (p < 0.001, Pearson Chi-Square test; when compared to a coin flip). The specificity of assignment of negation was 98.8{\%}. The positive likelihood ratio of the negation was 81. The positive predictive value (Precision) was 91.2{\%} Conclusion: Automated assignment of negation to concepts identified in health records based on review of the text is feasible and practical. Lexical assignment of negation is a good test of true Negativity as judged by the high sensitivity, specificity and positive likelihood ratio of the test. 
SNOMED-CT had overall coverage of 88.7{\%} of the concepts being negated.",
author = "Elkin, {Peter L.} and Brown, {Steven H.} and Bauer, {Brent A.} and Husser, {Casey S.} and William Carruth and Bergstrom, {Larry R.} and Wahner-Roedler, {Dietlind L.}",
year = "2005",
month = "5",
day = "5",
doi = "10.1186/1472-6947-5-13",
language = "English (US)",
volume = "5",
journal = "BMC Medical Informatics and Decision Making",
issn = "1472-6947",
publisher = "BioMed Central",

}

TY - JOUR

T1 - A controlled trial of automated classification of negation from clinical notes

AU - Elkin, Peter L.

AU - Brown, Steven H.

AU - Bauer, Brent A

AU - Husser, Casey S.

AU - Carruth, William

AU - Bergstrom, Larry R.

AU - Wahner-Roedler, Dietlind L.

PY - 2005/5/5

Y1 - 2005/5/5

N2 - Background: Identification of negation in electronic health records is essential if we are to understand the computable meaning of the records. Our objective was to compare the accuracy of an automated mechanism for assignment of negation to clinical concepts within a compositional expression with human-assigned negation, and to perform a failure analysis to identify the causes of poorly identified negation (i.e., missed conceptual representation, inaccurate conceptual representation, missed negation, and inaccurate identification of negation). Methods: 41 clinical documents (medical evaluations; outside of Mayo these are sometimes referred to as History and Physical Examinations) were parsed using the Mayo Vocabulary Server Parsing Engine. SNOMED-CT™ was used to provide concept coverage for the clinical concepts in the records. Parsing resulted in identification of concepts and textual clues to negation. The records were reviewed by an independent medical terminologist, and the results were tallied in a spreadsheet. Where questions arose during review, Internal Medicine faculty made the final determination. Results: SNOMED-CT was used to provide concept coverage of the 14,792 concepts in 41 health records from Johns Hopkins University. Of these, 1,823 concepts were identified as negative by human review. The sensitivity (recall) of the assignment of negation was 97.2% (p < 0.001, Pearson chi-square test, compared to a coin flip). The specificity of assignment of negation was 98.8%. The positive likelihood ratio of the negation was 81. The positive predictive value (precision) was 91.2%. Conclusion: Automated assignment of negation to concepts identified in health records based on review of the text is feasible and practical. Lexical assignment of negation is a good test of true negativity, as judged by the high sensitivity, specificity, and positive likelihood ratio of the test. SNOMED-CT had overall coverage of 88.7% of the concepts being negated.

AB - Background: Identification of negation in electronic health records is essential if we are to understand the computable meaning of the records. Our objective was to compare the accuracy of an automated mechanism for assignment of negation to clinical concepts within a compositional expression with human-assigned negation, and to perform a failure analysis to identify the causes of poorly identified negation (i.e., missed conceptual representation, inaccurate conceptual representation, missed negation, and inaccurate identification of negation). Methods: 41 clinical documents (medical evaluations; outside of Mayo these are sometimes referred to as History and Physical Examinations) were parsed using the Mayo Vocabulary Server Parsing Engine. SNOMED-CT™ was used to provide concept coverage for the clinical concepts in the records. Parsing resulted in identification of concepts and textual clues to negation. The records were reviewed by an independent medical terminologist, and the results were tallied in a spreadsheet. Where questions arose during review, Internal Medicine faculty made the final determination. Results: SNOMED-CT was used to provide concept coverage of the 14,792 concepts in 41 health records from Johns Hopkins University. Of these, 1,823 concepts were identified as negative by human review. The sensitivity (recall) of the assignment of negation was 97.2% (p < 0.001, Pearson chi-square test, compared to a coin flip). The specificity of assignment of negation was 98.8%. The positive likelihood ratio of the negation was 81. The positive predictive value (precision) was 91.2%. Conclusion: Automated assignment of negation to concepts identified in health records based on review of the text is feasible and practical. Lexical assignment of negation is a good test of true negativity, as judged by the high sensitivity, specificity, and positive likelihood ratio of the test. SNOMED-CT had overall coverage of 88.7% of the concepts being negated.

UR - http://www.scopus.com/inward/record.url?scp=23144436097&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=23144436097&partnerID=8YFLogxK

U2 - 10.1186/1472-6947-5-13

DO - 10.1186/1472-6947-5-13

M3 - Article

VL - 5

JO - BMC Medical Informatics and Decision Making

JF - BMC Medical Informatics and Decision Making

SN - 1472-6947

ER -