The medial entorhinal cortex encodes multisensory spatial information.

bioRxiv

Spatial Navigation and Memory Unit, National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, MD 20892, USA.

Published: January 2024

Animals employ spatial information from multiple sensory modalities to navigate their natural environments. However, it is unclear whether the brain encodes such information in separate cognitive maps or integrates it all into a single, universal map. We addressed this question in the microcircuit of the medial entorhinal cortex (MEC), a cognitive map of space. Using cellular-resolution calcium imaging, we examined the MEC of mice navigating virtual reality tracks in which visual and auditory cues provided comparable spatial information. We uncovered two cell types: "unimodality cells" and "multimodality cells". Unimodality cells specifically represent either auditory or visual spatial information. They are anatomically intermingled and maintain their sensory preferences across multiple tracks and behavioral states. Multimodality cells respond to both sensory modalities, with their responses shaped differentially by auditory and visual information. Thus, the MEC enables accurate spatial encoding during multisensory navigation by computing spatial information in different sensory modalities and generating distinct maps.


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10836072
DOI: http://dx.doi.org/10.1101/2024.01.09.574924


