Electroencephalograms provide a non-invasive and effective method for studying emotion recognition and developing Artificial Intelligence (AI) models to understand human behavior and decision-making processes. This study tested several machine learning classification kernels to develop an accurate emotion recognition model capable of classifying emotion stimuli such as "Boring," "Calm," "Happy," and "Fear" during gameplay. An emotion classifier was assessed using the publicly available database for an emotion recognition system based on EEG signals and various computer games (GAMEEMO). The signal processing method, referred to as Regression EEG (REGEEG), involves an efficient electrode-pairing selector developed for EEG signal processing using a regression algorithm, rotation matrices, director vectors, and robust statistical and polynomial feature extraction. REGEEG and the feature extraction methods were evaluated with 28 machine learning kernels, resulting in five kernels with classification performance above 80%, with the K-Nearest Neighbors (k-NN) based model outperforming the rest (achieving over 95% accuracy, F1-score, and kappa score). REGEEG performance was further validated using 30 Cross-Validation (CV) folds and 28 folds in the Leave-one-Subject-out (LoSo) technique without impacting the average classification performance. The classification results revealed low variance in the CV, while the LoSo approach helped identify outliers in the GAMEEMO dataset. Furthermore, the EEG channel-pair selector demonstrates superior classification performance, indicating a correlation between features and each processed pair of channels.
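The evaluation protocol described above (a k-NN classifier validated per held-out subject) can be sketched as follows. This is an illustrative sketch only: the REGEEG feature extraction is not reproduced here, so random features stand in for the real GAMEEMO-derived vectors, and the subject count, trial count, and k value are assumptions, not the study's exact configuration.

```python
# Hypothetical sketch of the validation scheme: k-NN classification of
# per-subject EEG feature vectors, assessed with Leave-One-Subject-Out
# (one CV fold per held-out subject). Random data stands in for the
# REGEEG features, which are not reproduced here.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_subjects, trials_per_subject, n_features = 28, 20, 12   # assumed sizes
X = rng.normal(size=(n_subjects * trials_per_subject, n_features))
y = rng.integers(0, 4, size=len(X))           # 4 classes: Boring/Calm/Happy/Fear
groups = np.repeat(np.arange(n_subjects), trials_per_subject)

clf = KNeighborsClassifier(n_neighbors=5)     # assumed k; the paper's k is not stated here
loso = LeaveOneGroupOut()                     # each fold holds out one subject
scores = cross_val_score(clf, X, y, groups=groups, cv=loso)
print(f"LOSO folds: {loso.get_n_splits(groups=groups)}, "
      f"mean accuracy: {scores.mean():.3f}")
```

With 28 simulated subjects, `LeaveOneGroupOut` yields exactly 28 folds, matching the fold count the abstract reports for the LoSo validation; per-subject score spread is what exposes outlier subjects.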
DOI: http://dx.doi.org/10.1109/JBHI.2025.3543729
Front Behav Neurosci
February 2025
Department of Cognitive Science, University of California, Irvine, CA, United States.
Emotional memories change over time, but the mechanisms supporting this change are not well understood. Sleep has been identified as one mechanism that supports memory consolidation: it selectively benefits the consolidation of negative emotional memories at the expense of neutral ones, and specific oscillatory events have been linked to this process. In contrast, the consolidation of neutral and positive memories, compared to negative memories, has been associated with increased vagally mediated heart rate variability (HRV) during wakefulness.
BMC Psychol
March 2025
Institute of Behavioural Sciences, Semmelweis University, Budapest, Hungary.
Metacognition and facial emotional expressions both play a major role in human social interactions [1, 2], the former as inner narrative and the latter as a primary communicational display; both are limited by self-monitoring, control, and their interaction with personal and social reference frames. The study aims to investigate how metacognitive abilities relate to facial emotional expressions, as a subject's inner narrative might project subconsciously and prime facial emotional expressions in a non-social setting. Subjects were presented online with a set of digitalised short-term memory tasks and attended a screening of artistic and artificial stimuli, during which their facial emotional expressions were recorded and analyzed by artificial intelligence.
BMC Psychol
March 2025
Swiss Federal University for Vocational Education and Training, Lausanne, Switzerland.
Background: According to the hypersensitivity hypothesis, highly emotionally intelligent individuals perceive emotion information at a lower threshold, pay more attention to emotion information, and may be characterized by more intense emotional experiences. The goal of the present study was to investigate whether and how emotional intelligence (EI) is related to hypersensitivity operationalized as heightened emotional and facial reactions when observing others narrating positive and negative life experiences.
Methods: Participants (144 women) watched positive and negative videos in three different conditions: with no specific instructions (spontaneous condition), with the instructions to put themselves in the character's shoes (empathic condition) and with the instructions to distinguish themselves from the character (distancing condition).
BMC Womens Health
March 2025
Department of Health Policy, Planning and Management, Makerere University School of Public Health, Kampala, Uganda.
Background: The low use of self-injectable contraception, coupled with the recognition that many individuals need support beyond training to use self-care technologies successfully, suggests the need for innovative programming. We describe the participatory human-centered design process we used in two districts of Uganda to develop a community-based peer support intervention to improve women's agency to make and act on contraceptive decisions and help diffuse self-injectable contraception.
Methods: The design team included multi-disciplinary researchers from Uganda and the United States, representatives of local community-based organizations and village health teams, and local women of reproductive age.
Sci Rep
March 2025
Department of Communication Sciences and Disorders, Saint Mary's College, Notre Dame, IN, USA.
Speech emotion recognition (SER) is an important application in Affective Computing and Artificial Intelligence. Recently, there has been significant interest in Deep Neural Networks that operate on speech spectrograms. Because the two-dimensional representation of the spectrogram captures more speech characteristics, convolutional neural networks (CNNs) and advanced image recognition models are leveraged to learn deep patterns in a spectrogram and effectively perform SER.
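The spectrogram-as-image idea above can be sketched in a few lines. This is not the study's pipeline: a pure sine tone stands in for a speech frame, and the sample rate and STFT window parameters are illustrative assumptions; the resulting 2-D frequency-by-time array is what a CNN-style SER model would consume.

```python
# Illustrative sketch (not from the study): converting a waveform into a
# log-power spectrogram, the 2-D "image" input typically fed to a CNN for SER.
import numpy as np
from scipy import signal

fs = 16_000                                   # assumed sample rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
wave = np.sin(2 * np.pi * 440 * t)            # stand-in for one second of speech

freqs, times, Sxx = signal.spectrogram(wave, fs=fs, nperseg=512, noverlap=256)
log_spec = 10 * np.log10(Sxx + 1e-10)         # log power, the common CNN input scale

# Rows index frequency bins, columns index time frames; a CNN treats this
# array as a single-channel image.
print(log_spec.shape)
```

The frequency axis has `nperseg // 2 + 1` bins, so the window length directly trades frequency resolution against time resolution, which is one reason SER work tunes these STFT parameters per dataset.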