Integration of sensory evidence and reward expectation in mouse perceptual decision-making task with various sensory uncertainties.

iScience

Institute for Quantitative Biosciences, University of Tokyo, Laboratory of Neural Computation, 1-1-1 Yayoi, Bunkyo-ku, Tokyo 113-0032, Japan.

Published: August 2021

In perceptual decision-making, prior knowledge of action outcomes is essential, especially when sensory inputs are insufficient for proper choices. Signal detection theory (SDT) shows that the optimal choice bias depends not only on the prior but also on the sensory uncertainty; however, it is unclear how animals integrate sensory inputs of varying uncertainty with reward expectations to optimize choices. We developed a tone-frequency discrimination task for head-fixed mice in which we randomly presented either a long or a short sound stimulus and biased the choice outcomes. Choices were less accurate and more biased toward the large-reward side in short- than in long-stimulus trials. Analysis with SDT found that mice did not use a separate, optimal choice threshold for each sound duration. Instead, mice updated a single threshold for short and long stimuli with a simple reinforcement-learning rule. Our task in head-fixed mice helps in understanding how the brain integrates sensory inputs and priors.
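The two models contrasted in the abstract can be sketched in code. The following is a minimal illustration, not the paper's fitted model: it assumes the standard equal-variance Gaussian SDT setup (evidence x drawn from N(−μ, σ) for "low" tones and N(+μ, σ) for "high" tones, choose "high" when x > c), under which the reward-maximizing criterion scales with the sensory noise σ, so an observer using separate optimal thresholds would bias short (noisier) trials more. The single-threshold update rule is likewise a generic, hypothetical prediction-error rule standing in for the paper's "simple reinforcement-learning rule"; the function names and parameters are assumptions for illustration.

```python
import numpy as np

def optimal_criterion(sigma, mu, r_left, r_right):
    # Equal-variance Gaussian SDT: choosing "high" when x > c maximizes
    # expected reward at c* = (sigma^2 / (2*mu)) * ln(r_left / r_right),
    # from the likelihood-ratio test  r_right * p(x|high) > r_left * p(x|low).
    # Larger sensory noise (sigma) pushes the optimal criterion further
    # toward the small-reward side, i.e. a stronger large-reward bias.
    return (sigma ** 2 / (2.0 * mu)) * np.log(r_left / r_right)

def update_threshold(c, chose_high, reward, expected, alpha=0.1):
    # Hypothetical single-threshold RL rule (details differ from the paper's
    # model): a positive reward prediction error after a "high" choice lowers
    # the shared threshold, making that choice more likely on the next trial,
    # regardless of whether the stimulus was short or long.
    delta = reward - expected  # reward prediction error
    return c - alpha * delta if chose_high else c + alpha * delta
```

With equal rewards the optimal criterion is 0 (unbiased); doubling the left reward shifts it rightward, and the shift is larger when sigma is larger, which is the uncertainty-dependent bias the separate-threshold observer would show.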


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8319806
DOI: http://dx.doi.org/10.1016/j.isci.2021.102826

Publication Analysis

Top Keywords

sensory inputs (12), perceptual decision-making (8), optimal choice (8), task head-fixed (8), head-fixed mice (8), sensory (5), integration sensory (4), sensory evidence (4), evidence reward (4), reward expectation (4)

Similar Publications

Toward the Bayesian brain: a generative model of information transmission by vestibular sensory neurons.

Front Neurol

December 2024

Department of Head and Neck Surgery and Brain Research Institute, David Geffen School of Medicine at UCLA, Los Angeles, CA, United States.

The relative accessibility and simplicity of vestibular sensing and vestibular-driven control of head and eye movements have made the vestibular system an attractive subject for experimenters and theoreticians interested in developing realistic quantitative models of how brains gather and interpret sense data and use them to guide behavior. Head stabilization and eye counter-rotation driven by vestibular sensory input in response to rotational perturbations are natural, ecologically important behaviors that can be reproduced in the laboratory and analyzed with relatively simple mathematical models. Models drawn from dynamical systems and control theory have previously been used to analyze the behavior of vestibular sensory neurons.


A collicular map for touch-guided tongue control.

Nature

January 2025

Department of Neurobiology and Behavior, Cornell University, Ithaca, NY, USA.

Accurate goal-directed behaviour requires the sense of touch to be integrated with information about body position and ongoing motion. Behaviours such as chewing, swallowing and speech critically depend on precise tactile events on a rapidly moving tongue, but neural circuits for dynamic touch-guided tongue control are unknown. Here, using high-speed videography, we examined three-dimensional lingual kinematics as mice drank from a water spout that unexpectedly changed position during licking, requiring re-aiming in response to subtle contact events on the left, centre or right surface of the tongue.


Objective: To determine the effects of exercise on trunk performance and balance in patients with spinal cord injury (SCI).

Methods: We searched the databases MEDLINE, Cochrane Library, EMBASE, Physiotherapy Evidence Database, Web of Science, PsycINFO, and CINAHL from inception to June 2020. Our search targeted studies such as randomized or non-randomized controlled trials and randomized crossover trials that evaluated the effects of exercise on trunk performance and balance in patients with SCI.


Comparing auditory and visual aspects of multisensory working memory using bimodally matched feature patterns.

Exp Brain Res

December 2024

Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Massachusetts General Hospital, CNY 149, 13th St, Charlestown, MA, 02129, USA.

Working memory (WM) reflects the transient maintenance of information in the absence of external input, which can be attained via multiple senses separately or simultaneously. The prevailing WM literature suggests that vision dominates over the other sensory systems. However, this imbalance may stem from challenges in finding comparable stimuli across modalities.


The correlational structure of brain activity dynamics in the absence of stimuli or behavior is often taken to reveal intrinsic properties of neural function. To test the limits of this assumption, we analyzed peripheral contributions to resting state activity measured by fMRI in unanesthetized, chemically immobilized male rats that emulate human neuroimaging conditions. We find that perturbation of somatosensory input channels modifies correlation strengths that relate somatosensory areas both to one another and to higher-order brain regions, despite the absence of ostensible stimuli or movements.

