Data reusability is an important feature of current research in virtually every field of science. Modern research in Affective Computing often relies on datasets containing experiment-originated data such as biosignals, video clips, or images. Moreover, conducting experiments with a vast number of participants to build datasets for Affective Computing research is time-consuming and expensive. It is therefore essential to provide solutions that allow data from a variety of sources to be (re)used, which usually demands data integration. This paper presents the Graph Representation Integrating Signals for Emotion Recognition and Analysis (GRISERA) framework, which provides a persistent model for storing integrated signals and methods for its creation. To the best of our knowledge, this is the first approach in the Affective Computing field that addresses the problem of integrating data from multiple experiments, storing it in a consistent way, and providing query patterns for data retrieval. The proposed framework is based on a standardized graph model, which is known to be highly suitable for signal processing purposes. Validation showed that data from the well-known AMIGOS dataset can be stored in the GRISERA framework and later retrieved for training deep learning models. Furthermore, a second case study showed that signals from multiple sources (AMIGOS, ASCERTAIN, and DEAP) can be integrated into GRISERA and retrieved for further statistical analysis.
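The abstract does not spell out GRISERA's actual schema or query language, but the general idea of a property-graph store with query patterns for retrieving signals can be sketched in a few lines of plain Python. The node labels and relationship names below (`Experiment`, `HAS_PARTICIPANT`, `HAS_SIGNAL`, etc.) are illustrative assumptions, not GRISERA's real model:

```python
class PropertyGraph:
    """A tiny in-memory property graph: labelled nodes with
    key-value properties, connected by named relationships."""

    def __init__(self):
        self.nodes = {}   # node id -> (label, properties dict)
        self.edges = []   # (source id, relation name, target id)
        self._next_id = 0

    def add_node(self, label, **props):
        nid = self._next_id
        self._next_id += 1
        self.nodes[nid] = (label, props)
        return nid

    def add_edge(self, src, relation, dst):
        self.edges.append((src, relation, dst))

    def neighbours(self, nid, relation):
        """Follow all outgoing edges with the given relation name."""
        return [d for s, r, d in self.edges if s == nid and r == relation]


# Illustrative integration of one experiment's data (schema is assumed):
g = PropertyGraph()
exp = g.add_node("Experiment", dataset="AMIGOS")
p1 = g.add_node("Participant", participant_id=1)
sig = g.add_node("Signal", modality="ECG", sampling_rate_hz=256)
g.add_edge(exp, "HAS_PARTICIPANT", p1)
g.add_edge(p1, "HAS_SIGNAL", sig)

# Query pattern: retrieve every signal recorded for an experiment's
# participants, regardless of which source dataset it came from.
signals = [s for p in g.neighbours(exp, "HAS_PARTICIPANT")
             for s in g.neighbours(p, "HAS_SIGNAL")]
```

Because datasets such as AMIGOS, ASCERTAIN, and DEAP would all map onto the same node and relationship types, a single traversal like the one above can retrieve comparable signals across sources.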
| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8230955 | PMC |
| http://dx.doi.org/10.3390/s21124035 | DOI Listing |
Neuropsychopharmacology
January 2025
Neurocognition and Emotion in Affective Disorders (NEAD) Centre, Psychiatric Centre Copenhagen, Mental Health Services, Capital Region of Denmark, Frederiksberg, Denmark.
Individuals with bipolar disorder (BD) show heterogeneity in clinical, cognitive, and daily functioning characteristics, which challenges accurate diagnostics and optimal treatment. A key goal is to identify brain-based biomarkers that inform patient stratification and serve as treatment targets. The objective of the present study was to apply a data-driven, multivariate approach to quantify the relationship between multimodal imaging features and behavioral phenotypes in BD.
Eur Psychiatry
January 2025
Department of Affective Disorders, Aarhus University Hospital - Psychiatry, Aarhus, Denmark.
Sensors (Basel)
December 2024
Department of Biomedical Engineering, University of Connecticut, Storrs, CT 06269, USA.
The field of emotion recognition from physiological signals is a growing area of research with significant implications for both mental health monitoring and human-computer interaction. This study introduces a novel approach to detecting emotional states based on fractal analysis of electrodermal activity (EDA) signals. We employed detrended fluctuation analysis (DFA), Hurst exponent estimation, and wavelet entropy calculation to extract fractal features from EDA signals obtained from the CASE dataset, which contains physiological recordings and continuous emotion annotations from 30 participants.
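The abstract names detrended fluctuation analysis (DFA) as one of the fractal features extracted from EDA, but gives no implementation details. A minimal first-order DFA sketch in plain Python, with illustrative window sizes (the study's actual scales and estimator settings are not stated):

```python
import math
import random

def dfa_alpha(signal, scales=(16, 32, 64, 128)):
    """First-order detrended fluctuation analysis.
    Returns the scaling exponent alpha: roughly 0.5 for white
    noise, about 1.0 for 1/f noise, and above 1 for smoother,
    strongly correlated signals such as tonic EDA drift."""
    mean = sum(signal) / len(signal)
    # Step 1: integrated profile of the mean-centred signal.
    profile, acc = [], 0.0
    for v in signal:
        acc += v - mean
        profile.append(acc)
    log_n, log_f = [], []
    for n in scales:
        ms_sum, count = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            # Step 2: least-squares linear detrend of each window.
            t_mean = (n - 1) / 2.0
            s_mean = sum(seg) / n
            num = sum((t - t_mean) * (s - s_mean) for t, s in enumerate(seg))
            den = sum((t - t_mean) ** 2 for t in range(n))
            slope = num / den
            # Mean squared residual around the fitted trend.
            ms_sum += sum((s - (s_mean + slope * (t - t_mean))) ** 2
                          for t, s in enumerate(seg)) / n
            count += 1
        log_n.append(math.log(n))
        log_f.append(0.5 * math.log(ms_sum / count))  # log of RMS F(n)
    # Step 3: alpha is the slope of log F(n) versus log n.
    ln_mean = sum(log_n) / len(log_n)
    lf_mean = sum(log_f) / len(log_f)
    num = sum((a - ln_mean) * (b - lf_mean) for a, b in zip(log_n, log_f))
    den = sum((a - ln_mean) ** 2 for a in log_n)
    return num / den

random.seed(0)
white_noise = [random.gauss(0, 1) for _ in range(4096)]
alpha = dfa_alpha(white_noise)  # close to 0.5 for white noise
```

In an emotion-recognition pipeline, `alpha` (together with Hurst and wavelet-entropy features) would be computed per EDA window and fed to a classifier.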
Sensors (Basel)
December 2024
Instituto de Estudios de Género, Universidad Carlos III de Madrid, Calle Madrid, 126, 28903 Getafe, Spain.
Emotion recognition through artificial intelligence and smart sensing of physical and physiological signals (affective computing) is achieving very interesting results in terms of accuracy, inference times, and user-independent models. At the same time, there are applications related to people's safety and well-being (sexual assault, gender-based violence, child and elder abuse, mental health, etc.) that demand even further improvement.
Brain Sci
December 2024
West China Institute of Children's Brain and Cognition, Chongqing University of Education, Chongqing 400065, China.
Background: Emotions play a crucial role in people's lives, profoundly affecting their cognition, decision-making, and interpersonal communication. Emotion recognition based on brain signals has become a significant challenge in the fields of affective computing and human-computer interaction.
Methods: To address the inaccurate feature extraction and low accuracy of existing deep learning models in emotion recognition, this paper proposes DACB, a multi-channel automatic classification model for emotion EEG signals based on dual attention mechanisms, convolutional neural networks, and bidirectional long short-term memory networks.
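The abstract names dual attention mechanisms among DACB's components but does not describe them. As one plausible ingredient, a channel-attention step (squeeze-and-excitation style) can be sketched in plain Python; this is an illustrative simplification, not the paper's actual architecture, and real models would learn the scoring function rather than use raw amplitude:

```python
import math

def channel_attention(channels):
    """channels: list of EEG channels, each a list of samples.
    Returns softmax attention weights over channels and the
    reweighted multi-channel signal."""
    # "Squeeze": summarise each channel with its mean absolute amplitude.
    scores = [sum(abs(v) for v in ch) / len(ch) for ch in channels]
    # Softmax converts the scores into weights that sum to 1
    # (subtracting the max keeps the exponentials numerically stable).
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # "Excite": scale each channel by its attention weight, so the
    # downstream CNN/BiLSTM layers see informative channels amplified.
    reweighted = [[w * v for v in ch] for w, ch in zip(weights, channels)]
    return weights, reweighted
```

In a full model along the lines the abstract describes, such a channel-attention block would sit before (or between) convolutional feature extraction and the bidirectional LSTM that models temporal dependencies.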