Use of Differential Entropy for Automated Emotion Recognition in a Virtual Reality Environment with EEG Signals.

Diagnostics (Basel)

Department of Electronics and Computer Engineering, Ngee Ann Polytechnic, Singapore 599489, Singapore.

Published: October 2022

Emotion recognition is one of the most important problems in human-computer interaction (HCI), neuroscience, and psychology. It is generally accepted that emotion recognition from neural data such as electroencephalography (EEG) signals, functional magnetic resonance imaging (fMRI), and near-infrared spectroscopy (NIRS) is more reliable and accurate than detection from behavioural cues such as speech, body language, or facial expressions. In particular, EEG signals are bioelectrical signals that are frequently used because of the many advantages they offer for emotion recognition. This study proposes an improved approach for EEG-based emotion recognition on VREED, a newly published, publicly available dataset. Differential entropy (DE) features were extracted from four wavebands (theta 4-8 Hz, alpha 8-13 Hz, beta 13-30 Hz, and gamma 30-49 Hz) to classify two emotional states (positive/negative). Five classifiers, namely Support Vector Machine (SVM), k-Nearest Neighbor (kNN), Naïve Bayes (NB), Decision Tree (DT), and Logistic Regression (LR), were employed with the DE features for automated classification of the two emotional states. The best average accuracy, 76.22 ± 2.06%, was obtained with the SVM classifier. Moreover, the results show that the highest average accuracy was produced by the gamma band, consistent with previous findings in EEG-based emotion recognition.
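As context for the feature pipeline the abstract describes, the following is a minimal sketch of how differential entropy features might be computed from band-filtered EEG. It assumes the band-filtered signal is approximately Gaussian (the standard DE approximation, DE = 0.5 ln(2*pi*e*variance)); the 128 Hz sampling rate, filter order, and synthetic data are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Wavebands listed in the abstract (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 49)}

def differential_entropy(x):
    """DE of an approximately Gaussian signal: 0.5 * ln(2*pi*e*var(x))."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

def de_features(eeg, fs=128):
    """eeg: array of shape (n_channels, n_samples).
    Returns one DE value per (band, channel) pair."""
    feats = []
    for lo, hi in BANDS.values():
        # 4th-order Butterworth band-pass, applied zero-phase
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        for ch in eeg:
            feats.append(differential_entropy(filtfilt(b, a, ch)))
    return np.array(feats)

# Illustrative use with synthetic data (2 channels, 4 s at 128 Hz)
rng = np.random.default_rng(0)
eeg = rng.standard_normal((2, 512))
features = de_features(eeg)  # 4 bands * 2 channels -> shape (8,)
```

Feature vectors of this kind could then be fed to a classifier such as scikit-learn's SVC for the positive/negative decision, analogous to the SVM used in the study.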


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9601226
DOI: http://dx.doi.org/10.3390/diagnostics12102508


Similar Publications

Background: Facial emotion recognition is one of the significant domains of social cognition that underlie social interactions. Deficits in this domain can influence the functional outcome in individuals with schizophrenia by impairing judgment toward others and reducing their capacity to function. We aimed to assess facial emotion recognition deficits in individuals with schizophrenia in comparison to healthy individuals and to examine their association with clinical and demographic profiles.

Frequently, we perceive emotional information through multiple channels (e.g., face, voice, posture).

Background: The prevalence of mental health disorders among children and adolescents presents a significant public health challenge. Children exposed to armed conflicts are at a particularly high risk of developing mental health problems, necessitating prompt and robust intervention. The acute need for early intervention in these situations is well recognized, as timely support can mitigate long-term negative outcomes.

Development and Validation of the Invisibility Scale.

Scand J Psychol

January 2025

Department of Psychology and Behavioural Sciences, Aarhus University, Aarhus, Denmark.

The concept of social invisibility describes the devaluation of the perceived social and personal worth of an individual. This paper presents the theoretical foundation for this construct, and the development and validation of the "Invisibility Scale" capturing experiences of and needs for social (in)visibility within (i) intimate, (ii) legal, and (iii) communal relations. We developed and validated the Invisibility Scale in two studies.

People routinely use facial expressions to communicate successfully and to regulate others' behaviour, yet modelling the form and meaning of these facial behaviours has proven surprisingly complex. One reason for this difficulty may lie in an over-reliance on the assumptions inherent in existing theories of facial expression: specifically, that (1) there is a putative set of facial expressions that signal an internal state of emotion, (2) patterns of facial movement have been empirically linked to the prototypical emotions in this set, and (3) static, non-social, posed images from convenience samples are adequate to validate the first two assumptions. These assumptions have guided the creation of datasets, which are then used to train unrepresentative computational models of facial expression.
