Medical training simulators can provide a safe and controlled environment for medical students to practice their physical examination skills. An important source of information for physicians is the visual feedback of involuntary pain facial expressions in response to physical palpation of an affected area of a patient. However, most existing robotic medical training simulators that can capture physical examination behaviours in real time cannot display facial expressions, and they represent only a limited range of patient identities in terms of ethnicity and gender. Together, these limitations restrict the utility of medical training simulators: they do not expose medical students to a representative sample of pain facial expressions and face identities, which could result in biased practices, and they cannot be used to detect and correct early signs of bias during medical training. Here, for the first time, we present a robotic system that can simulate facial expressions of pain in response to palpations, displayed on a range of patient face identities. Our approach is unique in that it models dynamic pain facial expressions using a data-driven, perception-based psychophysical method combined with the visuo-haptic inputs of users performing palpations on a robotic medical simulator. Specifically, participants performed palpation actions on the abdominal phantom of a simulated patient, which triggered the real-time display of six pain-related facial Action Units (AUs) on a robotic face (MorphFace), each controlled by two pseudo-randomly generated transient parameters: rate of change and activation delay. Participants then rated the appropriateness of the facial expression displayed in response to their palpations on a 4-point scale from "strongly disagree" to "strongly agree". Each participant (N = 16: 4 Asian females, 4 Asian males, 4 White females and 4 White males) performed 200 palpation trials on 4 patient identities (Black female, Black male, White female and White male) simulated using MorphFace. Results showed that the facial expressions rated most appropriate by all participants combined a higher rate of change and a shorter activation delay for upper-face AUs (around the eyes) than for lower-face AUs (around the mouth). In contrast, the transient parameter values of the expressions rated most appropriate, the palpation forces applied, and the delays between palpation actions varied across participant-simulated patient pairings according to gender and ethnicity. These findings suggest that gender and ethnicity biases affect both palpation strategies and the perception of the pain facial expressions displayed on MorphFace. We anticipate that our approach will be used to generate physical examination models with diverse patient demographics, reducing erroneous judgements in medical students and enabling focused training to address these errors.
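To make the palpation-to-expression pipeline concrete, the sketch below shows how the two transient parameters could drive one trial's AU dynamics. It is a minimal illustration, not the authors' implementation: the piecewise-linear ramp model, the force threshold, the numeric parameter ranges, and all names (AUParams, au_activation, PAIN_AUS) are assumptions, and the six AUs listed are a set commonly cited in the pain-expression literature rather than one confirmed by the abstract.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class AUParams:
    rate: float   # rate of change of AU intensity (intensity units per second)
    delay: float  # activation delay after the palpation trigger (seconds)

def au_activation(t, params, trigger_time=0.0, max_intensity=1.0):
    """Piecewise-linear AU intensity: zero until trigger_time + delay,
    then a ramp at `rate`, saturating at `max_intensity`."""
    onset = trigger_time + params.delay
    ramp = params.rate * np.clip(t - onset, 0.0, None)
    return np.minimum(ramp, max_intensity)

rng = np.random.default_rng(seed=0)

# Six pain-related AUs (an assumption; the abstract does not name them).
PAIN_AUS = ["AU4", "AU6", "AU7", "AU9", "AU10", "AU43"]

# Pseudo-randomly draw the two transient parameters per AU for one trial.
# The numeric ranges below are illustrative, not taken from the paper.
trial_params = {au: AUParams(rate=rng.uniform(0.5, 4.0),
                             delay=rng.uniform(0.0, 1.0))
                for au in PAIN_AUS}

t = np.linspace(0.0, 3.0, 301)   # 3 s of animation sampled at 100 Hz
palpation_force = 5.0            # N, as measured on the abdominal phantom
FORCE_THRESHOLD = 2.0            # N, assumed trigger threshold

if palpation_force > FORCE_THRESHOLD:
    # Each curve would drive the corresponding AU on the robotic face.
    frames = {au: au_activation(t, p) for au, p in trial_params.items()}
```

Consistent with the paper's perception-based method, the rate and delay values of trials a participant rated "strongly agree" could then be aggregated (for example, averaged per AU) to estimate that participant's preferred expression dynamics for each simulated patient identity.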


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8913843
DOI: http://dx.doi.org/10.1038/s41598-022-08115-1

Publication Analysis

Top Keywords

facial expressions: 36
pain facial: 20
medical training: 16
training simulators: 12
medical students: 12
physical examination: 12
facial: 11
expressions: 9
medical: 9

Similar Publications

Hemifacial microsomia (HFM) is a rare congenital disorder that affects facial symmetry and ear development and is accompanied by other congenital anomalies. However, known causal genes account for only approximately 6% of patients, indicating the need to discover more pathogenic genes. Association tests demonstrated an association between common variants in SHROOM3 and HFM.


GorillaFACS: The Facial Action Coding System for the Gorilla spp.
PLoS One, January 2025. Human Biology & Primate Cognition Department, Institute of Biology, Leipzig University, Leipzig, Germany.

The Facial Action Coding System (FACS) is an objective observation tool for measuring human facial behaviour. It avoids subjective attributions of meaning by objectively measuring independent movements linked to facial muscles, called Action Units (AUs). FACS has been adapted to 11 other taxa, including most apes, macaques and domestic animals, but not yet gorillas.


Introduction: Systemic lupus erythematosus (SLE) is a chronic inflammatory autoimmune disease that affects various body systems, including the skin and facial features. Estrogen promotes lupus in humans and in mouse models of SLE. In this study, we conducted an in vivo investigation of the effects of two estrogen receptors (ERα and ERβ) and platelet-activating factor acetylhydrolase (PAF-AH) on the symptoms of SLE.


Emotion perception is a fundamental aspect of our lives because others' emotions may provide important information about their reactions, attitudes, intentions, and behavior. Following the seminal work of Ekman, much of the research on emotion perception has focused on facial expressions. Recent evidence suggests, however, that facial expressions may be more ambiguous than previously assumed and that context also plays an important role in deciphering the emotional states of others.


There has been increased interest in standardized approaches to coding facial movement in mammals. Such approaches include Facial Action Coding Systems (FACS), in which individuals are trained to identify discrete facial muscle movements that combine to create a facial configuration. Some studies have used FACS to analyze facial signaling, recording the number of morphologically distinct facial signals a species can generate.

