Neuromodulation of early multisensory interactions in the visual cortex.

J Cogn Neurosci

Department of Psychology, University of Milano-Bicocca, Piazza dell'Ateneo Nuovo 1, 20126 Milan, Italy.

Published: May 2013

Merging information from different sensory channels allows the brain to amplify minimal signals and reduce their ambiguity, thereby improving the ability to orient to, detect, and identify environmental events. Although multisensory interactions have mostly been ascribed to the activity of higher-order heteromodal areas, multisensory convergence may arise even in primary, sensory-specific areas located very early along the cortical processing stream. In three experiments, we investigated early multisensory interactions in lower-level visual areas using a novel approach based on the coupling of behavioral stimulation with two noninvasive brain stimulation techniques, namely transcranial magnetic stimulation (TMS) and transcranial direct current stimulation (tDCS). First, we showed that redundant multisensory stimuli can increase visual cortical excitability, as measured by phosphene induction with occipital TMS; this physiological enhancement is followed by a behavioral facilitation through the amplification of signal intensity in sensory-specific visual areas. The more sensory inputs are combined (i.e., trimodal vs. bimodal stimuli), the greater the benefit to phosphene perception. Second, neuroelectrical activity changes induced by tDCS in the temporal and parietal cortices, but not in the occipital cortex, can further boost the multisensory enhancement of visual cortical excitability by increasing the auditory and tactile inputs from temporal and parietal regions, respectively, to lower-level visual areas.

Source: http://dx.doi.org/10.1162/jocn_a_00347

Publication Analysis

Top Keywords

multisensory interactions (12); visual areas (12); early multisensory (8); lower-level visual (8); visual cortical (8); cortical excitability (8); temporal parietal (8); multisensory (6); visual (6); areas (5)

Similar Publications

Haptic Technology: Exploring Its Underexplored Clinical Applications-A Systematic Review.

Biomedicines

December 2024

Neuromodulation Center and Center for Clinical Research Learning, Spaulding Rehabilitation Hospital and Massachusetts General Hospital, Harvard Medical School, Boston, MA 02115, USA.

Background/objectives: Haptic technology has transformed interactions between humans and both tangible and virtual environments. Despite its widespread adoption across various industries, the potential therapeutic applications of this technology have yet to be fully explored.

Methods: A systematic review of randomized controlled trials (RCTs) and randomized crossover trials was conducted, utilizing databases such as PubMed, Embase, Cochrane Library, and Web of Science.

Article Synopsis
  • Family caregivers (FCGs) of cancer patients in hospice face psychological challenges and decreased quality of life due to caregiving demands, signaling a need for supportive interventions.
  • A virtual reality (VR) nature experience was implemented, allowing FCGs to immerse themselves in calming scenes at home, which they found to enhance relaxation and provide an escape from their caregiving stress.
  • Preliminary findings indicate that the VR intervention is feasible and acceptable, suggesting it can support the emotional health of hospice FCGs, though further research with larger and more diverse groups is necessary.

Introduction: Persistent postural-perceptual dizziness (PPPD) is the most prevalent chronic functional dizziness in the clinic. Unsteadiness, dizziness, or non-spinning vertigo are the main symptoms of PPPD, and they are typically aggravated by upright posture, active or passive movement, and visual stimulation. The pathogenesis of PPPD remains incompletely understood, and it cannot be attributed to any specific anatomical defect within the vestibular system.


The COVID-19 pandemic has highlighted the prevalence of fatigue, reduced interpersonal interaction, and heightened stress in work environments. The intersection of neuroscience and architecture underscores how spatial perception is shaped by multisensory stimuli, profoundly influencing workers' well-being. In this study, EEG and VR technologies were employed to gather data on perception and cognition.


The integration and interaction of cross-modal senses in brain neural networks can facilitate high-level cognitive functionalities. In this work, we proposed a bioinspired multisensory integration neural network (MINN) that integrates visual and audio senses for recognizing multimodal information across different sensory modalities. This deep learning-based model incorporates a cascading framework of parallel convolutional neural networks (CNNs) for extracting intrinsic features from visual and audio inputs, and a recurrent neural network (RNN) for multimodal information integration and interaction.
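The abstract above describes the MINN only at a high level: parallel CNN encoders extract features from visual and audio streams, and a recurrent network fuses the two modalities. The sketch below illustrates that general pattern in PyTorch; all layer sizes, input shapes, the GRU fusion stage, and the classifier head are illustrative assumptions, not the authors' published configuration.

import torch
import torch.nn as nn

class ModalityEncoder(nn.Module):
    """Small CNN that maps one sensory stream (image or spectrogram frame) to a feature vector."""
    def __init__(self, in_channels: int, feat_dim: int = 128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # global pooling -> (B, 32, 1, 1)
        )
        self.proj = nn.Linear(32, feat_dim)

    def forward(self, x):                      # x: (B, C, H, W)
        return self.proj(self.conv(x).flatten(1))

class MINNSketch(nn.Module):
    """Parallel visual/audio CNN encoders followed by a GRU that fuses modalities over time."""
    def __init__(self, feat_dim: int = 128, hidden: int = 64, n_classes: int = 10):
        super().__init__()
        self.visual_enc = ModalityEncoder(in_channels=3, feat_dim=feat_dim)   # RGB frames
        self.audio_enc = ModalityEncoder(in_channels=1, feat_dim=feat_dim)    # e.g., spectrogram patches
        self.fusion_rnn = nn.GRU(input_size=2 * feat_dim, hidden_size=hidden, batch_first=True)
        self.classifier = nn.Linear(hidden, n_classes)

    def forward(self, visual_seq, audio_seq):
        # visual_seq: (B, T, 3, H, W); audio_seq: (B, T, 1, H, W)
        B, T = visual_seq.shape[:2]
        fused_steps = []
        for t in range(T):
            v = self.visual_enc(visual_seq[:, t])
            a = self.audio_enc(audio_seq[:, t])
            fused_steps.append(torch.cat([v, a], dim=1))   # concatenate the two modality features
        fused = torch.stack(fused_steps, dim=1)            # (B, T, 2 * feat_dim)
        out, _ = self.fusion_rnn(fused)
        return self.classifier(out[:, -1])                 # classify from the last fused state

# Example: 2 sequences of 4 time steps with 32x32 visual frames and spectrogram patches (all synthetic).
model = MINNSketch()
logits = model(torch.randn(2, 4, 3, 32, 32), torch.randn(2, 4, 1, 32, 32))
print(logits.shape)  # torch.Size([2, 10])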

