Emotions play a crucial role in human thought, cognitive processes, and decision-making. EEG has become a widely used tool in emotion recognition due to its high temporal resolution, real-time monitoring capability, portability, and cost-effectiveness. In this paper, we propose a novel end-to-end emotion recognition method from EEG signals, called MSDCGTNet, which is based on a Multi-Scale Dynamic 1D CNN and a Gated Transformer. First, the Multi-Scale Dynamic CNN extracts complex spatial and spectral features from raw EEG signals, which both avoids information loss and reduces the computational cost associated with time-frequency conversion of the signals. Second, the Gated Transformer Encoder captures global dependencies of the EEG signals; it focuses on specific regions of the input sequence while reducing computational resources through parallel processing with improved multi-head self-attention mechanisms. Third, a Temporal Convolutional Network extracts temporal features from the EEG signals. Finally, the extracted abstract features are fed into a classification module for emotion recognition. The proposed method was evaluated on three publicly available datasets: DEAP, SEED, and SEED_IV. Experimental results demonstrate the high accuracy and efficiency of the proposed method for emotion recognition. The approach proves robust and suitable for various practical applications. By addressing challenges posed by existing methods, the proposed method provides a valuable and effective solution for the field of Brain-Computer Interfaces (BCI).
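The sketch below illustrates the pipeline the abstract describes (multi-scale 1D CNN, then a gated Transformer encoder, then a TCN, then a classifier). It is a minimal, hypothetical PyTorch rendering, not the authors' implementation: the module names (`MultiScaleConv1d`, `GatedTransformerEncoderLayer`, `MSDCGTNetSketch`), kernel sizes, layer widths, the sigmoid-gated residual used here in place of the paper's improved attention mechanism, and the input shape (32 electrodes, 512 samples) are all assumptions for illustration.

```python
# Illustrative sketch of an MSDCGTNet-style pipeline (not the authors' code).
# All layer widths, kernel sizes, and the gating scheme are hypothetical.
import torch
import torch.nn as nn


class MultiScaleConv1d(nn.Module):
    """Parallel 1D convolutions with different kernel sizes over raw EEG."""
    def __init__(self, in_ch, out_ch, kernel_sizes=(3, 7, 15)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Sequential(
                nn.Conv1d(in_ch, out_ch, k, padding=k // 2),
                nn.BatchNorm1d(out_ch),
                nn.GELU(),
            )
            for k in kernel_sizes
        )

    def forward(self, x):                      # x: (batch, eeg_channels, time)
        return torch.cat([b(x) for b in self.branches], dim=1)


class GatedTransformerEncoderLayer(nn.Module):
    """Standard multi-head self-attention with a learned gate on each residual
    (stand-in for the paper's improved attention, which is not specified here)."""
    def __init__(self, d_model, n_heads=4, dim_ff=256, dropout=0.1):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, dropout=dropout, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(d_model, dim_ff), nn.GELU(), nn.Linear(dim_ff, d_model))
        self.norm1, self.norm2 = nn.LayerNorm(d_model), nn.LayerNorm(d_model)
        self.gate1, self.gate2 = nn.Linear(d_model, d_model), nn.Linear(d_model, d_model)

    def forward(self, x):                      # x: (batch, time, d_model)
        a, _ = self.attn(x, x, x)
        x = self.norm1(x + torch.sigmoid(self.gate1(x)) * a)   # gated residual
        f = self.ff(x)
        return self.norm2(x + torch.sigmoid(self.gate2(x)) * f)


class MSDCGTNetSketch(nn.Module):
    """End-to-end sketch: multi-scale CNN -> gated transformer -> TCN -> classifier."""
    def __init__(self, n_channels=32, n_classes=4, d_model=96):
        super().__init__()
        self.msconv = MultiScaleConv1d(n_channels, d_model // 3)   # 3 branches -> d_model
        self.encoder = GatedTransformerEncoderLayer(d_model)
        # Small dilated convolution stack as the temporal (TCN-style) feature extractor.
        self.tcn = nn.Sequential(
            nn.Conv1d(d_model, d_model, 3, padding=1, dilation=1), nn.GELU(),
            nn.Conv1d(d_model, d_model, 3, padding=2, dilation=2), nn.GELU(),
        )
        self.head = nn.Sequential(nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(d_model, n_classes))

    def forward(self, x):                      # x: (batch, eeg_channels, time)
        x = self.msconv(x)                     # (batch, d_model, time)
        x = self.encoder(x.transpose(1, 2))    # (batch, time, d_model)
        x = self.tcn(x.transpose(1, 2))        # (batch, d_model, time)
        return self.head(x)                    # (batch, n_classes) logits


if __name__ == "__main__":
    model = MSDCGTNetSketch()
    eeg = torch.randn(8, 32, 512)              # 8 trials, 32 electrodes, 512 samples
    print(model(eeg).shape)                    # torch.Size([8, 4])
```

In this layout the CNN branches keep the raw time axis intact (no spectrogram conversion), the transformer operates over time steps as tokens, and the classifier pools over time; the actual MSDCGTNet may differ in all of these details.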
Download full-text PDF | Source
---|---
http://dx.doi.org/10.1038/s41598-024-82705-z | DOI Listing
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11682401 | PMC