Metacognition and facial emotional expressions both play a major role in human social interactions [1, 2], as inner narrative and primary communicational display respectively, and both are limited by self-monitoring, control, and their interaction with personal and social reference frames. The study investigates how metacognitive abilities relate to facial emotional expressions, since a subject's inner narrative may be projected subconsciously and prime facial emotional expressions even in a non-social setting. Subjects completed a set of digitalised short-term memory tasks online and attended a screening of artistic and artificial stimuli, during which their facial emotional expressions were recorded and analyzed by artificial intelligence. Results show a self-assessment bias associated with emotional expressivity - neutrality, saturation, transparency - and a display of anger and hostility as an individually specific trait expressed to modality-dependent degrees. Our results indicate that, in subconscious communication - especially the expression, control and recognition of facial emotions - self-assessment bias interplays with empathetic skills and manipulation.
DOI: http://dx.doi.org/10.1186/s40359-025-02590-7
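The abstract does not name the facial-analysis software, so the sketch below is only an illustration of this kind of pipeline: it samples frames from a recorded video and scores each sampled frame's dominant emotion with the open-source DeepFace library. DeepFace is an assumed stand-in, not the study's documented toolchain, and the function and parameter names (emotions_per_frame, video_path, every_n) are hypothetical.

```python
# Illustrative sketch only - the study's actual facial-analysis toolchain is not
# specified in the abstract. Samples every n-th video frame and scores its
# dominant facial emotion with the open-source DeepFace library.
import cv2                      # OpenCV, used here for frame extraction
from deepface import DeepFace   # pretrained facial-attribute models

def emotions_per_frame(video_path, every_n=30):
    """Return the dominant emotion label for every n-th frame of a video."""
    cap = cv2.VideoCapture(video_path)
    labels, idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % every_n == 0:
            # enforce_detection=False avoids an exception on frames where no
            # face is found; recent DeepFace versions return a list of dicts.
            result = DeepFace.analyze(frame, actions=["emotion"],
                                      enforce_detection=False)
            labels.append(result[0]["dominant_emotion"])
        idx += 1
    cap.release()
    return labels
```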
Cureus
February 2025
1st Department of Anesthesiology and Pain Medicine, Aretaieion University Hospital, National and Kapodistrian University of Athens, Athens, GRC.
Introduction: Trigeminal neuralgia (TN) is a chronic condition characterized by sudden, short episodes of excruciating facial pain affecting one or more branches of the trigeminal nerve (V), which severely impacts patients' quality of life. Despite the availability of various treatment options, pain control remains poor in some patients. Sphenopalatine ganglion (SPG) block using the Tx360 nasal applicator has recently been introduced as a treatment option in such cases, with promising results.
BMC Psychol
March 2025
Swiss Federal University for Vocational Education and Training, Lausanne, Switzerland.
Background: According to the hypersensitivity hypothesis, highly emotionally intelligent individuals perceive emotion information at a lower threshold, pay more attention to emotion information, and may be characterized by more intense emotional experiences. The goal of the present study was to investigate whether and how emotional intelligence (EI) is related to hypersensitivity operationalized as heightened emotional and facial reactions when observing others narrating positive and negative life experiences.
Methods: Participants (144 women) watched positive and negative videos under three different conditions: with no specific instructions (spontaneous condition), with instructions to put themselves in the character's shoes (empathic condition), and with instructions to distinguish themselves from the character (distancing condition).
Sci Rep
March 2025
Department of Communication Sciences and Disorders, Saint Mary's College, Notre Dame, IN, USA.
Speech emotion recognition (SER) is an important application in Affective Computing and Artificial Intelligence. Recently, there has been significant interest in deep neural networks operating on speech spectrograms. Because the two-dimensional spectrogram representation captures more speech characteristics, convolutional neural networks (CNNs) and other advanced image-recognition models are leveraged to learn deep patterns in the spectrogram and perform SER effectively.
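As a hedged illustration of the spectrogram-plus-CNN approach described above (not the specific architecture from the Sci Rep paper, which the excerpt does not detail), the sketch below converts raw audio to a log-mel spectrogram with torchaudio and classifies it with a small CNN; the 16 kHz sample rate, 8 emotion classes, and layer sizes are assumptions.

```python
# A minimal sketch, assuming 16 kHz mono audio and 8 emotion classes; an
# illustrative spectrogram-CNN, not the model proposed in the cited paper.
import torch
import torch.nn as nn
import torchaudio

class SpectrogramCNN(nn.Module):
    def __init__(self, n_classes=8, sample_rate=16000):
        super().__init__()
        # Front end: waveform -> log-mel spectrogram (a 2-D time-frequency "image").
        self.mel = torchaudio.transforms.MelSpectrogram(
            sample_rate=sample_rate, n_fft=400, hop_length=160, n_mels=64)
        self.to_db = torchaudio.transforms.AmplitudeToDB()
        # Small CNN that learns local time-frequency patterns in the spectrogram.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),   # global pooling handles variable-length clips
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, waveform):                            # waveform: (batch, samples)
        spec = self.to_db(self.mel(waveform)).unsqueeze(1)  # (batch, 1, mels, frames)
        return self.classifier(self.features(spec).flatten(1))

# Usage: one second of random audio stands in for a real utterance.
model = SpectrogramCNN()
logits = model(torch.randn(2, 16000))                       # (2, 8) emotion logits
```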