Animals use spatial information from multiple sensory modalities to navigate their natural environments. However, it is unclear whether the brain encodes such information in separate cognitive maps or integrates it into a single, universal map. We address this question in the microcircuit of the medial entorhinal cortex (MEC), a cognitive map of space. Using cellular-resolution calcium imaging, we examine the MEC of mice navigating virtual reality tracks in which visual and auditory cues provide comparable spatial information. We uncover two cell types: "unimodality cells" and "multimodality cells." Unimodality cells specifically represent either auditory or visual spatial information; they are anatomically intermingled and maintain their sensory preferences across multiple tracks and behavioral states. Multimodality cells respond to both sensory modalities, with their responses shaped differentially by auditory or visual information. Thus, the MEC enables accurate spatial encoding during multisensory navigation by computing spatial information separately for each sensory modality and generating distinct maps.
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11539853
DOI: http://dx.doi.org/10.1016/j.celrep.2024.114813