Nonlinear fusion is optimal for a wide class of multisensory tasks.

PLoS Comput Biol

Department of Electrical and Electronic Engineering, Imperial College London, London, United Kingdom.

Published: July 2024

Animals continuously detect information via multiple sensory channels, like vision and hearing, and integrate these signals to realise faster and more accurate decisions, a fundamental neural computation known as multisensory integration. A widespread view of this process is that multimodal neurons linearly fuse information across sensory channels. However, does linear fusion generalise beyond the classical tasks used to explore multisensory integration? Here, we develop novel multisensory tasks, which focus on the underlying statistical relationships between channels, and deploy models at three levels of abstraction: from probabilistic ideal observers to artificial and spiking neural networks. Using these models, we demonstrate that when the information provided by different channels is not independent, linear fusion performs sub-optimally and even fails in extreme cases. This leads us to propose a simple nonlinear algorithm for multisensory integration which is compatible with our current knowledge of multimodal circuits, excels in naturalistic settings and is optimal for a wide class of multisensory tasks. Thus, our work emphasises the role of nonlinear fusion in multisensory integration, and provides testable hypotheses for the field to explore at multiple levels: from single neurons to behaviour.
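
The abstract's central claim, that linear fusion breaks down when the informative signal lies in the statistical relationship between channels rather than in each channel alone, can be illustrated with a toy simulation. The sketch below is not the paper's tasks, models or proposed algorithm; it is a minimal, hypothetical Python example in which the class is encoded by whether two noisy channels agree, so a purely linear readout sits at chance while adding a single multiplicative interaction term recovers the decision.

```python
# Minimal toy illustration (not the paper's tasks or algorithm): when the class
# depends on the *relationship* between two channels, a linear readout fails,
# while a simple nonlinear (multiplicative) interaction term recovers performance.
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Hypothetical task: the label is +1 when the two channels carry the same sign
# of evidence and -1 when they disagree (an XOR-like dependency between channels).
labels = rng.choice([-1, 1], size=n)
sign_a = rng.choice([-1, 1], size=n)
sign_b = sign_a * labels                      # channel agreement encodes the label
x_a = sign_a + 0.5 * rng.standard_normal(n)   # noisy channel A
x_b = sign_b + 0.5 * rng.standard_normal(n)   # noisy channel B

# Linear fusion: weighted sum of the two channel signals (weights fit by least squares).
X_lin = np.column_stack([x_a, x_b, np.ones(n)])
w_lin, *_ = np.linalg.lstsq(X_lin, labels, rcond=None)
acc_lin = np.mean(np.sign(X_lin @ w_lin) == labels)

# Nonlinear fusion: the same readout plus a multiplicative interaction term x_a * x_b.
X_nl = np.column_stack([x_a, x_b, x_a * x_b, np.ones(n)])
w_nl, *_ = np.linalg.lstsq(X_nl, labels, rcond=None)
acc_nl = np.mean(np.sign(X_nl @ w_nl) == labels)

print(f"linear fusion accuracy:    {acc_lin:.2f}")   # near chance (~0.5)
print(f"nonlinear fusion accuracy: {acc_nl:.2f}")    # well above chance
```

In this contrived setting each channel is uninformative on its own, so no weighted sum can separate the classes; the product term stands in for the kind of nonlinear interaction the paper argues multimodal circuits can implement.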

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11253934
DOI: http://dx.doi.org/10.1371/journal.pcbi.1012246

Publication Analysis

Top Keywords

multisensory tasks (12)
multisensory integration (12)
nonlinear fusion (8)
optimal wide (8)
wide class (8)
class multisensory (8)
sensory channels (8)
linear fusion (8)
multisensory (7)
fusion optimal (4)

Similar Publications

Women show enhanced proprioceptive target estimation through visual-proprioceptive conflict resolution.

Front Psychol

December 2024

Departamento de Psicologia, Laboratório de Neurociência do Comportamento, Pontifícia Universidade Católica do Rio de Janeiro, Rio de Janeiro, Brazil.

To form a unified and coherent perception of the organism's state and its relationship with the surrounding environment, the nervous system combines information from various sensory modalities through multisensory integration processes. Occasionally, data from two or more sensory channels may provide conflicting information. This is particularly evident in experiments using the mirror-guided drawing task and the mirror-box illusion, where there is conflict between positional estimates guided by vision and proprioception.
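
For context, conflicts of this kind are commonly described with the textbook reliability-weighted (maximum-likelihood) linear cue-combination rule, in which each unimodal estimate is weighted by its inverse variance. The Python sketch below illustrates that generic model with made-up numbers; it is not an analysis or result from this study.

```python
# Generic reliability-weighted (maximum-likelihood) cue combination; the function
# name and the example numbers are illustrative, not taken from the study.
def fuse_estimates(x_vis: float, var_vis: float, x_prop: float, var_prop: float):
    """Combine two noisy position estimates, weighting each by its reliability
    (inverse variance). Returns the fused estimate and its variance."""
    w_vis, w_prop = 1.0 / var_vis, 1.0 / var_prop
    x_hat = (w_vis * x_vis + w_prop * x_prop) / (w_vis + w_prop)
    var_hat = 1.0 / (w_vis + w_prop)   # fused variance is never larger than either cue's
    return x_hat, var_hat

# Example conflict: vision reports the target at 0 cm, proprioception at 4 cm.
# With vision four times more reliable, the fused estimate is pulled towards vision.
print(fuse_estimates(x_vis=0.0, var_vis=1.0, x_prop=4.0, var_prop=4.0))  # (0.8, 0.8)
```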

Multimodal MRI Analysis of Microstructural and Functional Connectivity Brain Changes Following Systematic Audio-Visual Training in a Virtual Environment.

Neuroimage

December 2024

Institute of Population Health, University of Liverpool, United Kingdom; Hanse Wissenschaftskolleg, Delmenhorst, Germany.

Recent work has shown rapid microstructural brain changes in response to learning new tasks. These cognitive tasks tend to draw on multiple brain regions connected by white matter (WM) tracts. Therefore, behavioural performance change is likely to be the result of microstructural, functional activation, and connectivity changes in extended neural networks.

The integration and interaction of cross-modal senses in brain neural networks can facilitate high-level cognitive functionalities. In this work, we propose a bioinspired multisensory integration neural network (MINN) that integrates visual and audio senses for recognizing multimodal information across different sensory modalities. This deep learning-based model incorporates a cascading framework of parallel convolutional neural networks (CNNs) for extracting intrinsic features from visual and audio inputs, and a recurrent neural network (RNN) for multimodal information integration and interaction.
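
As a rough sketch of the architecture described above, parallel CNN feature extractors for the visual and audio streams feeding a recurrent integrator, the PyTorch code below uses arbitrary toy input shapes and layer sizes; it is an assumption-laden mock-up, not the authors' MINN implementation.

```python
# Toy mock-up of a parallel-CNN + RNN multisensory network; all shapes and layer
# sizes are illustrative assumptions, not the published MINN architecture.
import torch
import torch.nn as nn

class ToyMINN(nn.Module):
    def __init__(self, hidden: int = 64, n_classes: int = 10):
        super().__init__()
        # Visual branch: 2D convolution over each image frame.
        self.vis_cnn = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
        )
        # Audio branch: 1D convolution over each waveform snippet.
        self.aud_cnn = nn.Sequential(
            nn.Conv1d(1, 8, 5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(16), nn.Flatten(),
        )
        # Recurrent stage integrates the concatenated features over time.
        self.rnn = nn.GRU(input_size=8 * 4 * 4 + 8 * 16, hidden_size=hidden, batch_first=True)
        self.readout = nn.Linear(hidden, n_classes)

    def forward(self, frames, audio):
        # frames: (batch, time, 1, 28, 28); audio: (batch, time, 1, 256)
        b, t = frames.shape[:2]
        v = self.vis_cnn(frames.flatten(0, 1)).view(b, t, -1)
        a = self.aud_cnn(audio.flatten(0, 1)).view(b, t, -1)
        out, _ = self.rnn(torch.cat([v, a], dim=-1))
        return self.readout(out[:, -1])   # decision read out from the last time step

logits = ToyMINN()(torch.randn(2, 5, 1, 28, 28), torch.randn(2, 5, 1, 256))
print(logits.shape)  # torch.Size([2, 10])
```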

Introduction: Dynamic modulation of grip occurs mainly within the major structures of the brain stem, in parallel with cortical control. This basic but fundamental level of the brain is robust to ill-formed feedback and, to be useful, may not require all of the perceptual information of which we are consciously aware. This makes it a viable candidate for peripheral nerve stimulation (PNS), a form of tactile feedback that conveys the intensity and location of touch well but does not currently reproduce other qualities of natural touch.

Limb amputation has devastating consequences, including the loss of motor and sensory functions and phantom limb pain (PLP). Neurostimulation-based approaches that provide artificial somatosensory feedback, such as peripheral nerve stimulation (PNS), spinal cord stimulation (SCS), and transcutaneous electrical nerve stimulation (TENS), have been developed to treat this condition. Yet the effectiveness of these different neurostimulation methods has rarely been tested in the same participants.
