Background: Recognising human emotions is a major challenge in the present era and has several applications in affective computing. Deep learning (DL) has proved a successful tool for predicting human emotions across different modalities.

Objective: To predict three-dimensional (3D) emotions with high accuracy from multichannel physiological signals, i.e. the electroencephalogram (EEG).

Methods: A hybrid DL model consisting of a convolutional neural network (CNN) and gated recurrent units (GRU) is proposed in this work for emotion recognition from EEG data. The CNN learns abstract representations, whereas the GRU captures temporal correlations. A bi-directional variant of the GRU is used here to learn features in both directions. Discrete and dimensional emotion indices are recognised on two publicly available datasets, SEED and DREAMER, respectively. Fused features of energy and Shannon entropy (EnSE) and of energy and differential entropy (EnDE) are fed into the proposed classifier to improve the efficiency of the model.
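The abstract does not define how the energy and entropy terms of the fused features are computed. A minimal numpy sketch, assuming sum-of-squares energy, a histogram-based Shannon entropy, and the Gaussian closed form for differential entropy commonly used in EEG work (segment shape and bin count are illustrative assumptions):

```python
import numpy as np

def energy(x):
    # Signal energy: sum of squared samples (standard definition).
    return np.sum(x ** 2)

def shannon_entropy(x, bins=16):
    # Shannon entropy of the amplitude histogram (one common choice;
    # the paper may instead use a spectral variant).
    hist, _ = np.histogram(x, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def differential_entropy(x):
    # For an approximately Gaussian segment, DE = 0.5 * ln(2*pi*e*var),
    # the closed form widely used in EEG emotion recognition.
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

def fused_features(eeg, kind="EnSE"):
    # eeg: (channels, samples). Concatenate per-channel energy with
    # Shannon entropy (EnSE) or differential entropy (EnDE).
    ent = shannon_entropy if kind == "EnSE" else differential_entropy
    return np.array([[energy(ch), ent(ch)] for ch in eeg]).ravel()

# Hypothetical 62-channel segment (SEED uses a 62-electrode montage).
x = np.random.default_rng(0).standard_normal((62, 200))
print(fused_features(x, "EnSE").shape)  # (124,)
```

Each channel contributes an (energy, entropy) pair, so the fused vector has twice as many entries as there are channels.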

Results: The performance of the presented model is measured in terms of average accuracy, which is 86.9% and 93.9% on the SEED and DREAMER datasets, respectively.

Conclusion: The proposed convolution bi-directional gated recurrent unit neural network (CNN-BiGRU) model outperforms most state-of-the-art and competitive hybrid DL models, which indicates the effectiveness of emotion recognition from EEG signals and provides a scientific basis for implementation in human-computer interaction (HCI).
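The CNN-BiGRU pipeline named above can be sketched as follows. This is a hypothetical PyTorch illustration, not the authors' implementation: the layer sizes, kernel width, hidden dimension, the 62-channel input (SEED's montage), and the 3-class output (SEED's positive/neutral/negative labels) are all assumptions, since the abstract gives no hyperparameters.

```python
import torch
import torch.nn as nn

class CNNBiGRU(nn.Module):
    """Sketch of a CNN followed by a bidirectional GRU classifier."""

    def __init__(self, n_channels=62, n_classes=3, hidden=64):
        super().__init__()
        # CNN front end: learns abstract per-timestep representations.
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Bidirectional GRU: models temporal correlation in both directions.
        self.bigru = nn.GRU(32, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):           # x: (batch, channels, time)
        h = self.conv(x)            # (batch, 32, time // 2)
        h = h.transpose(1, 2)       # (batch, time // 2, 32) for the GRU
        out, _ = self.bigru(h)      # (batch, time // 2, 2 * hidden)
        return self.fc(out[:, -1])  # classify from the final time step

logits = CNNBiGRU()(torch.randn(4, 62, 200))
print(logits.shape)  # torch.Size([4, 3])
```

The convolution extracts local features per window, and the bidirectional GRU reads the resulting sequence forwards and backwards, matching the division of labour the abstract describes.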

Source: http://dx.doi.org/10.3233/THC-220458
