AI Article Synopsis

  • Our ability to understand meaningful action events relies on visual and auditory senses working together, and this multisensory integration can be altered in some individuals and clinical groups.
  • A meta-analysis compiled coordinates from 82 neuroimaging studies to explore how the brain processes audio-visual interactions, creating activation likelihood estimate (ALE) brain maps organized by specific stimulus categories.
  • The findings identified key brain regions ("hubs") preferentially involved in multisensory processing along several stimulus dimensions, such as living vs. nonliving events, vocalizations vs. actions, emotional stimuli, and dynamic vs. static visuals, shedding light on how knowledge and perception are represented in the brain.

Article Abstract

Our ability to perceive meaningful action events involving objects, people, and other animate agents is characterized in part by an interplay of visual and auditory sensory processing and their cross-modal interactions. However, this multisensory ability can be altered or dysfunctional in some hearing and sighted individuals, and in some clinical populations. The present meta-analysis sought to test current hypotheses regarding neurobiological architectures that may mediate audio-visual multisensory processing. Reported coordinates from 82 neuroimaging studies (137 experiments) that revealed some form of audio-visual interaction in discrete brain regions were compiled, converted to a common coordinate space, and then organized along specific categorical dimensions to generate activation likelihood estimate (ALE) brain maps and various contrasts of those derived maps. The results revealed brain regions (cortical "hubs") preferentially involved in multisensory processing along different stimulus category dimensions, including 1) living versus nonliving audio-visual events, 2) audio-visual events involving vocalizations versus actions by living sources, 3) emotionally valent events, and 4) dynamic-visual versus static-visual audio-visual stimuli. These meta-analysis results are discussed in the context of neurocomputational theories of semantic knowledge representations and perception, and the brain volumes of interest are available for download to facilitate data interpretation for future neuroimaging studies.
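The "activation likelihood estimate (ALE)" maps referred to above can be made more concrete with a brief sketch. In ALE meta-analysis, each reported activation focus is modeled as a 3-D Gaussian probability blob in a common coordinate space, the blobs from one experiment are combined into a single modeled activation (MA) map, and the ALE value at each voxel is the probability that at least one experiment activated it. The Python below is a toy illustration of that idea only: the grid size, kernel width, and foci are invented for the example, and it is not the authors' pipeline, which used reported coordinates from 82 studies (137 experiments), sample-size-dependent kernels, and permutation-based significance thresholding.

import numpy as np

def modeled_activation(foci_mm, grid_shape, voxel_size_mm, fwhm_mm):
    """MA map for one experiment: voxelwise max over Gaussian blobs centered on each focus."""
    sigma = fwhm_mm / (2.0 * np.sqrt(2.0 * np.log(2.0)))        # convert FWHM to standard deviation
    axes = [np.arange(n) * voxel_size_mm for n in grid_shape]   # voxel-center coordinates in mm
    coords = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1)
    ma = np.zeros(grid_shape)
    for focus in foci_mm:
        dist_sq = np.sum((coords - np.asarray(focus)) ** 2, axis=-1)
        blob = np.exp(-dist_sq / (2.0 * sigma ** 2))            # unnormalized Gaussian around the focus
        ma = np.maximum(ma, blob)                               # one MA map per experiment
    return ma

def ale_map(experiments, grid_shape=(20, 20, 20), voxel_size_mm=4.0, fwhm_mm=10.0):
    """ALE at each voxel: probability that at least one experiment's MA 'activates' it."""
    ale = np.zeros(grid_shape)
    for foci in experiments:
        ma = modeled_activation(foci, grid_shape, voxel_size_mm, fwhm_mm)
        ale = 1.0 - (1.0 - ale) * (1.0 - ma)                    # union of per-experiment probabilities
    return ale

# Three hypothetical experiments reporting foci (in mm) near the same region.
experiments = [
    [(40, 40, 40), (44, 40, 36)],
    [(38, 42, 40)],
    [(42, 38, 44), (60, 20, 20)],
]
print("Peak ALE value:", round(float(ale_map(experiments).max()), 3))

Voxels covered by foci from several experiments receive high ALE values, which is how a meta-analysis of this kind identifies the cortical "hubs" described in the results; contrasts between category-specific ALE maps (e.g., living vs. nonliving events) then follow from comparing the derived maps.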

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7941256
DOI Listing: http://dx.doi.org/10.1093/texcom/tgab002

Publication Analysis

Top Keywords

audio-visual interaction (8)
events involving (8)
multisensory processing (8)
neuroimaging studies (8)
brain regions (8)
audio-visual events (8)
audio-visual (6)
events (5)
brain (5)
meta-analyses support (4)
