Negative emotions in drivers can lead to dangerous driving behaviors and, in turn, to serious traffic accidents. However, most current studies of driver emotion rely on a single modality, such as EEG, eye tracking, or driving data. In complex situations, a single modality may fail to capture a driver's complete emotional state and offers poor robustness. In recent years, some studies have used multimodal approaches to monitor a single emotion such as driver fatigue or anger, but in real driving environments, negative emotions such as sadness, anger, fear, and fatigue all have a significant impact on driving safety, and very few studies have used multimodal data to accurately recognize drivers' emotions comprehensively. This paper therefore takes a multimodal approach to improving comprehensive driver emotion recognition. By combining three modalities (the driver's voice, facial images, and video sequences), it performs six-class classification of driver emotions: sadness, anger, fear, fatigue, happiness, and emotional neutrality. To accurately identify drivers' negative emotions and thereby improve driving safety, this paper proposes a multimodal fusion framework based on CNN + Bi-LSTM + HAM for driver emotion recognition. The framework fuses feature vectors from driver audio, facial expressions, and video sequences for comprehensive driver emotion recognition. Experiments demonstrate the effectiveness of the proposed multimodal data for driver emotion recognition, achieving a recognition accuracy of 85.52%. The validity of the method is further verified through comparative experiments and evaluation metrics such as accuracy and F1 score.
Full text: PMC http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10574905 | DOI http://dx.doi.org/10.3390/s23198293
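The abstract above names the components of the fusion framework (a CNN, Bi-LSTM branches, a HAM attention stage, and late fusion of three modalities) but not how they are wired together. The following is a minimal PyTorch sketch of one plausible wiring, not the authors' implementation: the input shapes, feature dimensions, layer sizes, and the simple attention pooling standing in for the paper's HAM are all assumptions made for illustration.

```python
# Minimal sketch (not the authors' code): CNN branch for face images, Bi-LSTM
# branches for audio and video-sequence features, a toy attention pool standing
# in for the paper's HAM, and late fusion into a six-class emotion head.
import torch
import torch.nn as nn

NUM_CLASSES = 6  # sadness, anger, fear, fatigue, happiness, neutrality


class AttentionPool(nn.Module):
    """Toy attention: weights the time steps of a Bi-LSTM output and pools them."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, x):                       # x: (batch, time, dim)
        w = torch.softmax(self.score(x), dim=1)
        return (w * x).sum(dim=1)               # (batch, dim)


class MultimodalEmotionNet(nn.Module):
    def __init__(self, audio_dim=40, video_dim=256, hidden=128):
        super().__init__()
        # CNN branch for a single facial image (3 x 64 x 64 assumed here).
        self.face_cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),                       # -> (batch, 32)
        )
        # Bi-LSTM branches for frame-level audio features and video-frame embeddings.
        self.audio_lstm = nn.LSTM(audio_dim, hidden, batch_first=True, bidirectional=True)
        self.video_lstm = nn.LSTM(video_dim, hidden, batch_first=True, bidirectional=True)
        self.audio_att = AttentionPool(2 * hidden)
        self.video_att = AttentionPool(2 * hidden)
        # Late fusion of the three modality vectors into a six-way classifier.
        self.classifier = nn.Sequential(
            nn.Linear(32 + 4 * hidden, 128), nn.ReLU(), nn.Dropout(0.3),
            nn.Linear(128, NUM_CLASSES),
        )

    def forward(self, face, audio, video):
        f = self.face_cnn(face)                 # (batch, 32)
        a, _ = self.audio_lstm(audio)           # (batch, T_audio, 2*hidden)
        v, _ = self.video_lstm(video)           # (batch, T_video, 2*hidden)
        fused = torch.cat([f, self.audio_att(a), self.video_att(v)], dim=1)
        return self.classifier(fused)           # logits over the six emotions


# Example forward pass with dummy tensors.
model = MultimodalEmotionNet()
logits = model(torch.randn(2, 3, 64, 64), torch.randn(2, 50, 40), torch.randn(2, 30, 256))
print(logits.shape)  # torch.Size([2, 6])
```

The paper's actual attention stage and fusion strategy may differ; the sketch only illustrates the general pattern of encoding each modality separately (CNN for the static face image, Bi-LSTM for the temporal audio and video streams) and concatenating the pooled feature vectors before the six-way classifier.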
Cogn Neurodyn
December 2025
Department of Computer Science and Engineering, Sathyabama Institute of Science and Technology, Chennai, Tamil Nadu, India.
Emotion recognition plays a crucial role in brain-computer interfaces (BCIs), which identify and classify human emotions as positive, negative, or neutral. Emotion analysis in BCI holds substantial promise in fields such as healthcare, education, gaming, and human-computer interaction. In healthcare, emotion analysis based on electroencephalography (EEG) signals is deployed to provide personalized support for patients with autism or mood disorders.
JMIR Ment Health
January 2025
The Samueli Initiative for Responsible AI in Medicine, Tel Aviv University, Tel Aviv, Israel.
Generative artificial intelligence (GenAI) shows potential for personalized care, psychoeducation, and even crisis prediction in mental health, yet responsible use requires ethical consideration, deliberation, and perhaps even governance. This is the first published theme issue focused on responsible GenAI in mental health. It brings together evidence and insights on GenAI's capabilities, such as emotion recognition, therapy-session summarization, and risk assessment, while highlighting the sensitive nature of mental health data and the need for rigorous validation.
J Prev Alzheimers Dis
February 2025
Department of Medicine, Ramathibodi Hospital, Mahidol University, Bangkok, Thailand.
Background: Cognitive training (CT) is one of the important non-pharmaceutical interventions that could delay cognitive decline. Currently, no definitive CT methods are available. Furthermore, little attention has been paid to the effect of CT on mood and instrumental activities of daily living (IADL).
Brain Res
January 2025
Department of Basic Medicine, School of Medicine, Hangzhou City University, Hangzhou, Zhejiang 310015, China; Key Laboratory of Novel Targets and Drug Study for Neural Repair of Zhejiang Province, School of Medicine, Hangzhou City University, Hangzhou, Zhejiang 310015, China. Electronic address:
Whisker deprivation at different stages of early development results in varied behavioral outcomes. However, there is a notable lack of systematic studies evaluating the specific effects of whisker deprivation from postnatal day 0 (P0) to P14 on adolescent behavioral performance in mice. To investigate these effects, C57BL/6J mice underwent whisker deprivation from P0 to P14 and were subsequently assessed at 5 weeks of age using a battery of tests: motor skills were evaluated with the open field test; emotional behavior was assessed with a series of anxiety- and depression-related behavioral tests; cognitive function was examined via novel location and novel object recognition tests; and social interactions were analyzed using the three-chamber social interaction test.
J Integr Neurosci
January 2025
Department of Psychology, The Affiliated Hospital of Jiangnan University, 214151 Wuxi, Jiangsu, China.
Background: Deficits in emotion recognition have been shown to be closely related to social-cognitive functioning in schizophrenia. This study aimed to investigate the event-related potential (ERP) characteristics of social perception in schizophrenia patients and to explore the neural mechanisms underlying the abnormal cognitive processes related to social perception.
Methods: Participants included 33 schizophrenia patients and 35 healthy controls (HCs).