There is debate within the literature as to whether emotion dysregulation (ED) in Attention-Deficit/Hyperactivity Disorder (ADHD) reflects deviant attentional mechanisms or atypical perceptual emotion processing. Previous reviews have reliably examined facial, but not vocal, emotion recognition accuracy in ADHD. The present meta-analysis quantified vocal emotion recognition (VER) accuracy scores, gathered from 21 published and unpublished papers, in ADHD and control groups using robust variance estimation. Additional moderator analyses were carried out to determine whether VER accuracy in ADHD varied depending on emotion type. Findings revealed a medium effect size for the presence of VER deficits in ADHD, and moderator analyses showed that VER accuracy in ADHD did not differ by emotion type. These results support theories that implicate attentional mechanisms in driving VER deficits in ADHD. However, there are insufficient data within the behavioural VER literature to support the presence of emotion-processing atypicalities in ADHD. Future neuroimaging research could explore the interaction between attention and emotion processing in ADHD, taking into consideration ADHD subtypes and comorbidities.
DOI: http://dx.doi.org/10.1080/02699931.2023.2258590
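The meta-analytic method named above, robust variance estimation (RVE), pools effect sizes while tolerating dependence among effect sizes drawn from the same paper. A minimal Python sketch of the core computation follows; the effect sizes, variances, and cluster structure are invented for illustration, and real analyses (e.g., the robumeta package in R) add small-sample corrections and explicit working models that are omitted here.

```python
import numpy as np

# Each tuple: (cluster_id, Hedges' g, sampling variance). Clusters group
# dependent effect sizes contributed by the same paper. All values invented.
effects = [
    (0, 0.45, 0.04), (0, 0.52, 0.05),
    (1, 0.30, 0.06),
    (2, 0.61, 0.03), (2, 0.58, 0.04),
    (3, 0.25, 0.07),
]
cluster = np.array([c for c, _, _ in effects])
g = np.array([gi for _, gi, _ in effects])
v = np.array([vi for _, _, vi in effects])

w = 1.0 / v                              # inverse-variance weights (simplified)
beta = np.sum(w * g) / np.sum(w)         # pooled effect estimate
resid = g - beta

# RVE sandwich variance: sum the weighted residuals *within* each cluster
# before squaring, so within-study dependence does not bias the SE.
var_robust = sum(
    np.sum(w[cluster == c] * resid[cluster == c]) ** 2
    for c in np.unique(cluster)
) / np.sum(w) ** 2

print(f"pooled g = {beta:.3f}, robust SE = {np.sqrt(var_robust):.3f}")
```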
Emotion
January 2025
Department of Psychology, Cognitive and Affective Neuroscience Unit, University of Zurich.
Affective voice signaling has significant biological and social relevance across various species, and different affective signaling types have emerged through the evolution of voice communication. These types range from basic affective voice bursts and nonverbal affective vocalizations up to affective intonations superimposed on speech utterances in humans, in the form of paraverbal prosodic patterns. These different types of affective signaling should have evolved to be acoustically and perceptually distinctive, allowing accurate and nuanced affective communication.
Neuropsychiatr Dis Treat
January 2025
Department of Psychiatry, Beijing Children's Hospital, Capital Medical University, National Center for Children's Health, Beijing, People's Republic of China.
Purpose: Tic disorders are neurodevelopmental disorders characterized by involuntary movements or vocalizations (tics), often accompanied by anxiety symptoms. However, the relationships between tic severity, age, and anxiety symptoms remain unclear. Here, we investigated the association between tic severity and age and examined how anxiety symptoms might influence this relationship.
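One common way to examine how a third variable such as anxiety might influence an age-severity relationship is a moderation test via an interaction term in a regression model. The sketch below illustrates that generic approach on simulated data; the variable names, simulated effects, and modelling choices are assumptions, not the study's actual analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "age": rng.uniform(6, 16, n),        # years (illustrative range)
    "anxiety": rng.normal(10, 3, n),     # hypothetical anxiety score
})
# Simulate tic severity with a small age x anxiety interaction built in.
df["tic_severity"] = (
    20 - 0.5 * df["age"] + 0.3 * df["anxiety"]
    + 0.05 * df["age"] * df["anxiety"]
    + rng.normal(0, 2, n)
)

# "age * anxiety" expands to age + anxiety + age:anxiety; the coefficient
# on the age:anxiety term is the moderation test.
model = smf.ols("tic_severity ~ age * anxiety", data=df).fit()
print(model.params)
print("moderation p-value:", model.pvalues["age:anxiety"])
```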
Cortex
December 2024
Institute of Research in Psychology (IPSY) & Institute of Neuroscience (IoNS), Louvain Bionics Center, University of Louvain (UCLouvain), Louvain-la-Neuve, Belgium; School of Health Sciences, HES-SO Valais-Wallis, The Sense Innovation and Research Center, Lausanne & Sion, Switzerland. Electronic address:
Effective social communication depends on integrating emotional expressions conveyed by the face and the voice. Although seen and heard emotional expressions are consistently reported to be integrated automatically, direct signatures of multisensory integration in the human brain remain elusive. Here we implemented a multi-input electroencephalographic (EEG) frequency tagging paradigm to investigate neural populations integrating facial and vocal fearful expressions.
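The logic of a multi-input frequency tagging paradigm can be illustrated with a toy simulation: if the facial and vocal streams are tagged at distinct frequencies f1 and f2, any nonlinear (integrative) combination of the two inputs produces responses at intermodulation frequencies such as f1 + f2 and f2 - f1, which are absent under purely additive processing. The frequencies, nonlinearity, and noise level below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

fs, dur = 512, 20.0                      # sampling rate (Hz) and duration (s)
t = np.arange(0, dur, 1 / fs)
f1, f2 = 1.2, 1.6                        # assumed tagging frequencies (Hz)

face = np.sin(2 * np.pi * f1 * t)        # visually tagged drive
voice = np.sin(2 * np.pi * f2 * t)       # auditorily tagged drive
noise = 0.2 * np.random.default_rng(1).standard_normal(t.size)

# Additive terms produce energy only at f1 and f2; the multiplicative
# term stands in for integration and adds energy at f2 - f1 and f1 + f2.
response = face + voice + 0.5 * face * voice + noise

amp = np.abs(np.fft.rfft(response)) * 2 / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

for f in (f1, f2, f2 - f1, f1 + f2):
    print(f"{f:.1f} Hz amplitude: {amp[np.argmin(np.abs(freqs - f))]:.3f}")
```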
J Neurosci
January 2025
Department of Psychology, Chinese University of Hong Kong, Hong Kong SAR, China.
The extraction and analysis of pitch underpin speech and music recognition, sound segregation, and other auditory tasks. Perceptually, pitch can be represented as a helix composed of two factors: height, which rises monotonically with frequency, and chroma, which repeats cyclically with every doubling of frequency (octave). Although the early perceptual and neurophysiological mechanisms for extracting pitch from acoustic signals have been extensively investigated, the equally essential subsequent stages that bridge to high-level auditory cognition remain less well understood.
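The helix description can be made concrete with a small worked example: height is the base-2 logarithm of frequency relative to a reference, chroma is the fractional (cyclic) part of that value, and the two together place a tone on a helix. The reference frequency and coordinate mapping below are illustrative choices, not taken from the paper.

```python
import numpy as np

def pitch_helix(freq_hz, f_ref=261.63):  # f_ref: middle C, an assumed anchor
    """Map a frequency to (x, y, height) coordinates on the pitch helix."""
    height = np.log2(freq_hz / f_ref)    # rises one unit per octave
    chroma = height % 1.0                # cyclic position within the octave
    angle = 2 * np.pi * chroma
    return np.cos(angle), np.sin(angle), height

# C4 and C5 share chroma (same x, y) but differ in height by one octave.
for f in (261.63, 329.63, 523.25):       # approx. C4, E4, C5
    x, y, z = pitch_helix(f)
    print(f"{f:7.2f} Hz -> circle ({x:+.2f}, {y:+.2f}), height {z:+.2f}")
```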
PLoS One
January 2025
Department of Psychology, Tokyo Woman's Christian University, Tokyo, Japan.
We perceive and understand others' emotional states from multisensory information such as facial expressions and vocal cues. However, such cues are not always available or clear. Can partial loss of visual cues affect multisensory emotion perception? The question has become practically relevant: the COVID-19 pandemic led to widespread use of face masks, which occlude some of the facial cues used in emotion perception.
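A standard way to reason about such questions is reliability-weighted cue combination, in which each modality's estimate is weighted by its inverse variance; degrading the facial cue (e.g., with a mask) lowers its reliability and shifts the combined percept toward the voice. The sketch below implements that generic model with invented numbers; it is not necessarily the analysis used in this study.

```python
def combine(face_est, face_var, voice_est, voice_var):
    """Reliability-weighted (maximum-likelihood) fusion of two noisy estimates."""
    w_face = (1 / face_var) / (1 / face_var + 1 / voice_var)
    fused = w_face * face_est + (1 - w_face) * voice_est
    fused_var = 1 / (1 / face_var + 1 / voice_var)
    return fused, fused_var

# Invented numbers: a mask raises the facial cue's variance (lowers its
# reliability), pulling the fused estimate toward the vocal cue.
print(combine(face_est=0.8, face_var=0.1, voice_est=0.4, voice_var=0.2))  # unmasked
print(combine(face_est=0.8, face_var=0.5, voice_est=0.4, voice_var=0.2))  # masked
```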