There is debate within the literature as to whether emotion dysregulation (ED) in attention-deficit/hyperactivity disorder (ADHD) reflects deviant attentional mechanisms or atypical perceptual emotion processing. Previous reviews have reliably examined facial, but not vocal, emotion recognition accuracy in ADHD. The present meta-analysis quantified vocal emotion recognition (VER) accuracy scores in ADHD and control groups using robust variance estimation, drawing on 21 published and unpublished papers. Additional moderator analyses were carried out to determine whether VER accuracy in ADHD varied by emotion type. Findings revealed a medium effect size for VER deficits in ADHD, and moderator analyses showed that VER accuracy in ADHD did not differ by emotion type. These results support theories that implicate attentional mechanisms in driving VER deficits in ADHD. However, there are insufficient data within the behavioural VER literature to support the presence of emotion-processing atypicalities in ADHD. Future neuroimaging research could explore the interaction between attention and emotion processing in ADHD, taking into consideration ADHD subtypes and comorbidities.
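For readers unfamiliar with the pooling method, the sketch below illustrates the core of robust variance estimation for an intercept-only meta-analytic model. The effect sizes and variances are hypothetical, and the small-sample corrections typically applied in practice are omitted; this is a minimal illustration of the estimator, not the authors' analysis code.

```python
import numpy as np

# Hypothetical per-study data: Hedges' g effect sizes and their variances.
# (Illustrative numbers only -- not values from the meta-analysis.)
g = np.array([0.45, 0.60, 0.30, 0.55, 0.40])
v = np.array([0.02, 0.05, 0.03, 0.04, 0.06])

# Inverse-variance weights and the pooled mean effect.
w = 1.0 / v
b = np.sum(w * g) / np.sum(w)

# Robust (sandwich) variance estimate: squared weighted residuals
# replace the model-based variance, guarding against misspecified
# within-study variances or dependent effect sizes.
# (Small-sample adjustments are omitted for brevity.)
resid = g - b
var_robust = np.sum(w**2 * resid**2) / np.sum(w) ** 2
se_robust = np.sqrt(var_robust)

print(f"pooled g = {b:.3f}, robust SE = {se_robust:.3f}")
print(f"95% CI ~ [{b - 1.96*se_robust:.3f}, {b + 1.96*se_robust:.3f}]")
```

The robust standard error is what makes this approach attractive for syntheses like the one above, where several effect sizes may come from the same sample and the usual independence assumption would otherwise be violated.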


Source: http://dx.doi.org/10.1080/02699931.2023.2258590

Publication Analysis

Top Keywords

vocal emotion (12)
emotion recognition (12)
emotion processing (12)
accuracy adhd (12)
ver accuracy (12)
adhd (10)
attention-deficit hyperactivity (8)
hyperactivity disorder (8)
emotion (8)
attentional mechanisms (8)

Similar Publications

Affective voice signaling has significant biological and social relevance across various species, and different affective signaling types have emerged through the evolution of voice communication. These types range from basic affective voice bursts and nonverbal affective vocalizations up to affective intonations superimposed on speech utterances in humans in the form of paraverbal prosodic patterns. These different types of affective signaling should have evolved to be acoustically and perceptually distinctive, allowing accurate and nuanced affective communication.


Relationship Between Anxiety Symptoms and Age-Related Differences in Tic Severity.

Neuropsychiatr Dis Treat

January 2025

Department of Psychiatry, Beijing Children's Hospital, Capital Medical University, National Center for Children's Health, Beijing, People's Republic of China.

Purpose: Tic disorders are neurodevelopmental disorders characterized by sudden, repetitive movements or vocalizations (tics), often accompanied by anxiety symptoms. However, the relationships between tic severity, age, and anxiety symptoms remain unclear. Here, we investigated the association between tic severity and age and examined how anxiety symptoms might influence this relationship.


Intermodulation frequencies reveal common neural assemblies integrating facial and vocal fearful expressions.

Cortex

December 2024

Institute of Research in Psychology (IPSY) & Institute of Neuroscience (IoNS), Louvain Bionics Center, University of Louvain (UCLouvain), Louvain-la-Neuve, Belgium; School of Health Sciences, HES-SO Valais-Wallis, The Sense Innovation and Research Center, Lausanne & Sion, Switzerland.

Effective social communication depends on the integration of emotional expressions coming from the face and the voice. Although there are consistent reports on how seeing and hearing emotion expressions can be automatically integrated, direct signatures of multisensory integration in the human brain remain elusive. Here we implemented a multi-input electroencephalographic (EEG) frequency tagging paradigm to investigate neural populations integrating facial and vocal fearful expressions.
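To make the tagging logic concrete, the minimal sketch below enumerates candidate intermodulation frequencies for two hypothetical tagging rates f1 and f2 (illustrative values, not taken from the article). Responses at combinations n·f1 ± m·f2, which are absent from either input alone, are the signatures that index neural populations combining the two streams.

```python
# Hypothetical tagging frequencies: face stream at f1 Hz, voice stream at f2 Hz.
f1, f2 = 5.0, 6.0

# Intermodulation (IM) frequencies |n*f1 + m*f2| for small nonzero n, m.
# Energy at these frequencies cannot come from either input alone, so it
# is taken as evidence of neural integration of the two streams.
ims = sorted({abs(n * f1 + m * f2)
              for n in range(-3, 4) for m in range(-3, 4)
              if n != 0 and m != 0})
print(ims)
```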


The extraction and analysis of pitch underpin speech and music recognition, sound segregation, and other auditory tasks. Perceptually, pitch can be represented as a helix composed of two factors: height monotonically aligns with frequency, while chroma cyclically repeats at doubled frequencies. Although the early perceptual and neurophysiological mechanisms for extracting pitch from acoustic signals have been extensively investigated, the equally essential subsequent stages that bridge to high-level auditory cognition remain less well understood.
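As a minimal illustration of the helix factorization (the log2 parameterization and reference frequency are assumptions made for the example, not details from the article), octave-spaced tones share chroma while differing in height:

```python
import math

def pitch_helix(freq_hz, f_ref=27.5):
    """Map a frequency to (height, chroma) helix coordinates.

    height: log-frequency, increasing monotonically with frequency.
    chroma: fractional part of log2(f / f_ref), repeating every octave.
    """
    height = math.log2(freq_hz / f_ref)
    chroma = height % 1.0
    return height, chroma

# Octave-separated tones (A3, A4, A5) share chroma but differ in height.
for f in (220.0, 440.0, 880.0):
    h, c = pitch_helix(f)
    print(f"{f:6.1f} Hz -> height {h:.2f}, chroma {c:.2f}")
```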


We perceive and understand others' emotional states from multisensory information such as facial expressions and vocal cues. However, such cues are not always available or clear. Can partial loss of visual cues affect multisensory emotion perception? In addition, the COVID-19 pandemic has led to the widespread use of face masks, which can reduce some facial cues used in emotion perception.

