Introduction: Deficits in emotional perception are common in autistic people, but it remains unclear to what extent these perceptual impairments are linked to specific sensory modalities, to specific emotions, or to multisensory facilitation.

Methods: This study aimed to investigate uni- and bimodal perception of emotional cues, as well as multisensory facilitation, in autistic (n = 18, mean age: 36.72 years, SD: 11.36) compared to non-autistic (n = 18, mean age: 36.41 years, SD: 12.18) people using auditory, visual, and audiovisual stimuli.

Results: Lower identification accuracy and longer response times were found in high-functioning autistic people. These differences were independent of modality and emotion and showed large effect sizes (Cohen's d = 0.8-1.2). Furthermore, multisensory facilitation of response time was observed in non-autistic people but was absent in autistic people, whereas the two groups did not differ in multisensory facilitation of accuracy.
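The abstract reports effect sizes without the computation; for reference, Cohen's d for two independent groups is the mean difference divided by the pooled standard deviation. A minimal sketch (the data below are illustrative placeholders, not the study's):

```python
import numpy as np

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Cohen's d for two independent groups, using the pooled SD."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Hypothetical response times (ms) for two groups of n = 18 -- not real data.
rng = np.random.default_rng(0)
rt_autistic = rng.normal(900, 150, 18)
rt_non_autistic = rng.normal(750, 150, 18)
print(f"d = {cohens_d(rt_autistic, rt_non_autistic):.2f}")
```

With n = 18 per group, values of d between 0.8 and 1.2 correspond to clearly separated group means, consistent with the "large effect" label.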

Discussion: These findings suggest that autistic individuals process the auditory and visual components of audiovisual stimuli more separately, with temporal demands equivalent to processing the respective unimodal cues, yet with a similar relative improvement in accuracy; in non-autistic individuals, by contrast, integrative merging of multimodal stimulus properties appears to occur at an earlier processing stage.
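The abstract does not state how facilitation was quantified; a common convention in this literature (an assumption here, not necessarily the authors' method) is to compare bimodal response times against the faster of the two unimodal conditions:

```python
import numpy as np

def rt_facilitation(rt_auditory, rt_visual, rt_audiovisual) -> float:
    """Relative multisensory facilitation of response time: proportional
    speed-up of the bimodal condition over the best unimodal condition."""
    best_unimodal = min(np.mean(rt_auditory), np.mean(rt_visual))
    return (best_unimodal - np.mean(rt_audiovisual)) / best_unimodal

# Hypothetical per-participant mean RTs (ms), illustrative only.
print(rt_facilitation([820, 790, 840], [760, 740, 780], [690, 700, 710]))
```

A positive value indicates facilitation; the pattern reported above would correspond to values near zero in the autistic group. Stricter tests (e.g., Miller's race-model inequality) operate on full RT distributions rather than means.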


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10165112
DOI: http://dx.doi.org/10.3389/fpsyt.2023.1151665


Similar Publications

The integration and interaction of cross-modal senses in brain neural networks can support high-level cognitive function. In this work, we proposed a bioinspired multisensory integration neural network (MINN) that integrates visual and auditory senses to recognize multimodal information across sensory modalities. This deep learning-based model incorporates a cascading framework of parallel convolutional neural networks (CNNs) for extracting intrinsic features from visual and audio inputs, and a recurrent neural network (RNN) for multimodal information integration and interaction.
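The snippet names the architecture but not its dimensions; below is a minimal PyTorch sketch of the described pattern, parallel CNN encoders whose features a shared RNN integrates (all layer sizes and input shapes are assumptions, not the paper's):

```python
import torch
import torch.nn as nn

class MINNSketch(nn.Module):
    """Parallel CNN encoders (visual, audio) feeding a shared RNN integrator."""
    def __init__(self, n_classes: int = 10, feat: int = 64):
        super().__init__()
        # Visual branch: 2D convolution over an image.
        self.visual = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, feat))
        # Audio branch: 1D convolution over a waveform.
        self.audio = nn.Sequential(
            nn.Conv1d(1, 16, 5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(16, feat))
        # RNN treats the two modality features as a two-step sequence.
        self.rnn = nn.GRU(feat, feat, batch_first=True)
        self.head = nn.Linear(feat, n_classes)

    def forward(self, img, wav):
        v = self.visual(img)                 # (batch, feat)
        a = self.audio(wav)                  # (batch, feat)
        seq = torch.stack([v, a], dim=1)     # (batch, 2, feat)
        _, h = self.rnn(seq)                 # integrate across modalities
        return self.head(h[-1])

logits = MINNSketch()(torch.randn(2, 3, 32, 32), torch.randn(2, 1, 16000))
```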


Hydrogel-based flexible electronic components have become the optimal solution to the rigidity problem of traditional electronics in health management. In this study, a multipurpose hydrogel is introduced, formed by combining a dual network of physical (chitosan, polyvinyl alcohol (PVA)) and chemical (poly(N-isopropylacrylamide (NIPAM)-co-acrylamide (AM))) cross-linking with signal-conversion fillers (eutectic gallium-indium (EGaIn), Ti3C2 MXene, polyaniline (PANI)) that respond to external stimuli. The hydrogel permits multiple sensing of both dynamic and static signals.


Decoding visual and auditory stimuli from brain activity, such as electroencephalography (EEG), offers promising advances for machine-to-human interaction. However, effectively representing EEG signals remains a significant challenge. In this paper, we introduce a novel Delayed Knowledge Transfer (DKT) framework that employs spiking neurons for attention detection, using our experimental EEG dataset.
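The snippet does not detail DKT's spiking components; as a hedged illustration of the building block such frameworks rest on, here is a plain leaky integrate-and-fire (LIF) neuron in NumPy (all parameters arbitrary, not taken from the paper):

```python
import numpy as np

def lif_spikes(current: np.ndarray, tau: float = 20.0, v_th: float = 1.0,
               v_reset: float = 0.0, dt: float = 1.0) -> np.ndarray:
    """Leaky integrate-and-fire: the membrane potential leaks toward rest,
    integrates input current, and emits a spike on crossing threshold."""
    v, spikes = 0.0, np.zeros_like(current)
    for t, i_in in enumerate(current):
        v += dt * (-v + i_in) / tau          # leaky integration step
        if v >= v_th:
            spikes[t], v = 1.0, v_reset      # spike, then reset
    return spikes

# Hypothetical EEG-derived drive signal, illustrative only.
drive = np.abs(np.random.default_rng(1).normal(1.2, 0.4, 200))
print(int(lif_spikes(drive).sum()), "spikes")
```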


The multisensory control of sequential actions.

Exp Brain Res, December 2024. Department of Medical and Translational Biology, Umeå University, S-901 87 Umeå, Sweden.

Many motor tasks comprise sequentially linked action phases, as when reaching for, lifting, transporting, and replacing a cup of coffee. During such tasks, discrete visual, auditory, and/or haptic feedback is typically associated with mechanical events at the completion of each action phase, as when breaking and subsequently making contact between the cup and the table. An emerging concept is that important sensorimotor control operations affecting subsequent action phases are centred on these discrete multisensory events.
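As a toy illustration of that concept (phase and event names are invented, not from the paper), a controller can gate each phase transition on confirmation of the expected discrete sensory event:

```python
# Hypothetical phases paired with the discrete sensory event that
# should mark their completion, per the coffee-cup example above.
PHASES = [("reach", "contact_made"), ("lift", "liftoff"),
          ("transport", "target_reached"), ("replace", "contact_broken")]

def run_task(sensory_events):
    """Advance through action phases, gated on discrete multisensory events."""
    for phase, expected in PHASES:
        observed = next(sensory_events)
        if observed != expected:
            return f"corrective control in '{phase}' (got '{observed}')"
    return "task complete"

print(run_task(iter(["contact_made", "liftoff", "target_reached", "contact_broken"])))
```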


Multisensory integration of social signals by a pathway from the basal amygdala to the auditory cortex in maternal mice.

Curr Biol, November 2024. Cold Spring Harbor Laboratory, 1 Bungtown Road, Cold Spring Harbor, NY 11724, USA.

Social encounters are inherently multisensory events, yet how and where social cues of distinct sensory modalities merge and interact in the brain is poorly understood. When their pups wander away from the nest, mother mice use a combination of vocal and olfactory signals emitted by the pups to locate and retrieve them. Previous work revealed the emergence of multisensory interactions in the auditory cortex (AC) of both dams and virgins who cohabitate with pups ("surrogates").

