Twenty-two right-handed subjects were asked either to identify sad, neutral, or laughing faces presented on a computer monitor or to view the same pictures passively without classification. Visual evoked potentials were recorded from the F3/4, C3/4, P3/4, O1/2, and T5/6 derivations. Compared with passive viewing, emotion recognition was characterized by a higher level of cortical activity, reflected in larger N1, N2, and N3 amplitudes and shorter N1, P2, and N2 latencies; in contrast, the latencies of the later P3 and N3 waves were longer. Dynamic brain mapping revealed symmetrical activation of the frontocentral areas during emotion recognition but only right-sided activation during passive perception. Factor analysis demonstrated a more complex N2 structure in the face emotion recognition task: a principal component corresponding to the descending part of the N2 wave was revealed, which probably reflects the stage of image classification.

Similar Publications

Facial emotion recognition (FER) can serve as a valuable tool for assessing emotional states, which are often linked to mental health. However, mental health encompasses a broad range of factors that go beyond facial expressions. While FER provides insights into certain aspects of emotional well-being, it can be used in conjunction with other assessments to form a more comprehensive understanding of an individual's mental health.

Cross-regional cultural recognition of adolescent voice emotion.

Front Psychol

December 2024

Department of Psychology, Shaoxing University, Shaoxing, Zhejiang, China.

Background: Previous studies have demonstrated an in-group advantage in emotion recognition, suggesting that individuals are more proficient at identifying emotions within their own culture than within other cultures. However, existing research focuses mainly on cross-cultural variation in vocal emotion recognition, with limited attention paid to intracultural differences. Furthermore, little research has examined adolescents' ability to recognize emotions conveyed by vocal cues in various cultural settings.

Explicit metrics for implicit emotions: investigating physiological and gaze indices of learner emotions.

Front Psychol

December 2024

Department of Learning, Data-Analytics and Technology, Faculty of Behavioural, Management and Social Sciences, University of Twente, Enschede, Netherlands.

Learning experiences are intertwined with emotions, which in turn have a significant effect on learning outcomes. Therefore, digital learning environments can benefit from taking the emotional state of the learner into account. The first step toward doing so is real-time emotion detection, which is made possible by sensors that can continuously collect physiological and eye-tracking data.

Affective Theory of Mind (ToM) is the ability to understand other people's emotional states and feelings. Several studies have shown impaired affective ToM abilities in people with Parkinson's disease (PD). However, most studies tested this ability using single-stimulus-modality tasks (visual cues).

Autism spectrum disorder (ASD) involves challenges in communication and social interaction, including challenges in recognizing emotions. Existing technological solutions aim to improve social behaviors in individuals with ASD by providing learning aids. This paper presents a real-time environmental translator designed to enhance social behaviors in individuals with ASD using sensory substitution.
