AI Article Synopsis

  • Racism and implicit bias lead to inequities in health care, prompting this study to focus on how stigmatizing language in electronic health records affects health disparities.
  • The research involved a scoping review of existing literature, sourcing studies from various databases to analyze the presence of stigmatizing language in clinician notes up to April 2022.
  • Findings revealed that negative language used by clinicians can adversely influence patient experiences and outcomes; the authors suggest that Natural Language Processing (NLP) could help identify and mitigate stigmatizing language in health documentation.

Article Abstract

Background: Racism and implicit bias underlie disparities in health care access, treatment, and outcomes. An emerging area of study in examining health disparities is the use of stigmatizing language in the electronic health record (EHR).

Objectives: We sought to summarize the existing literature related to stigmatizing language documented in the EHR. To this end, we conducted a scoping review to identify, describe, and evaluate the current body of literature related to stigmatizing language and clinician notes.

Methods: We searched PubMed, Cumulative Index of Nursing and Allied Health Literature (CINAHL), and Embase databases in May 2022, and also conducted a hand search of IEEE to identify studies investigating stigmatizing language in clinical documentation. We included all studies published through April 2022. The results for each search were uploaded into EndNote X9 software, de-duplicated using the Bramer method, and then exported to Covidence software for title and abstract screening.

Results: Studies (N = 9) used cross-sectional (n = 3), qualitative (n = 3), mixed methods (n = 2), and retrospective cohort (n = 1) designs. Stigmatizing language was defined via content analysis of clinical documentation (n = 4), literature review (n = 2), interviews with clinicians (n = 3) and patients (n = 1), expert panel consultation, and task force guidelines (n = 1). Natural language processing was used in four studies to identify and extract stigmatizing words from clinical notes. All of the studies reviewed concluded that negative clinician attitudes and the use of stigmatizing language in documentation could negatively impact patient perception of care or health outcomes.

Discussion: The current literature indicates that NLP is an emerging approach to identifying stigmatizing language documented in the EHR. NLP-based solutions can be developed and integrated into routine documentation systems to screen for stigmatizing language and alert clinicians or their supervisors. Potential interventions resulting from this research could generate awareness about how implicit biases affect communication patterns and work to achieve equitable health care for diverse populations.
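The screening approach described above can be illustrated with a minimal lexicon-matching sketch. This is not the method used in any of the reviewed studies — the term list below is purely hypothetical, whereas the studies derived their lexicons from content analysis, interviews, and expert panels — but it shows the basic shape of flagging candidate terms in a note:

```python
import re

# Hypothetical, illustrative term list -- NOT drawn from the reviewed
# studies, which built their lexicons via content analysis, clinician
# and patient interviews, and expert panel consultation.
STIGMATIZING_TERMS = ["noncompliant", "drug seeking", "difficult", "unmotivated"]

def flag_stigmatizing_language(note: str) -> list[str]:
    """Return lexicon terms found in a clinical note (case-insensitive,
    whole-word matches only)."""
    hits = []
    for term in STIGMATIZING_TERMS:
        if re.search(r"\b" + re.escape(term) + r"\b", note, flags=re.IGNORECASE):
            hits.append(term)
    return hits

note = "Patient is noncompliant with medication and appears drug seeking."
print(flag_stigmatizing_language(note))  # ['noncompliant', 'drug seeking']
```

A production system of the kind the Discussion envisions would go well beyond exact matching (e.g., handling negation and context with trained NLP models) before alerting clinicians or supervisors.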

Download full-text PDF

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11213326 (PMC)
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0303653 (PLOS)

Publication Analysis

Top Keywords

stigmatizing language (36)
clinical documentation (12)
language (10)
stigmatizing (9)
identifying stigmatizing (8)
language clinical (8)
scoping review (8)
health care (8)
literature stigmatizing (8)
language documented (8)

