Publications by authors named "Taro Toyoizumi"

A crucial challenge in targeted manipulation of neural activity is to identify perturbation sites whose stimulation exerts significant downstream effects with high efficacy, a procedure currently achieved by labor-intensive and potentially harmful trial and error. Can one predict the effects of electrical stimulation on neural activity from the circuit dynamics observed during spontaneous periods? Here we show that the effects of single-site micro-stimulation on ensemble activity in an alert monkey's prefrontal cortex can be predicted solely from the ensemble's spontaneous activity. We first estimated the ensemble's causal flow from the directed functional interactions inferred during spontaneous periods using convergent cross-mapping, and showed that it uncovers a causal hierarchy among the recording electrodes.
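
The directionality analysis here rests on convergent cross-mapping (CCM). As a rough, self-contained illustration of the technique only (toy coupled logistic maps, not the monkey recordings or the authors' pipeline; the functions delay_embed and ccm_skill and all constants are illustrative choices): if a variable drives another, its history is recoverable from the driven variable's delay-embedded states, so the cross-map from the driven series back to the driver scores higher.

```python
# Minimal convergent cross-mapping (CCM) sketch on hypothetical toy data.
import numpy as np

def delay_embed(x, dim=3, tau=1):
    """Time-delay embedding of a 1D series into dim-dimensional state vectors."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def ccm_skill(x, y, dim=3, tau=1):
    """Cross-map from y's shadow manifold to predict x; high skill suggests x drives y."""
    Mx = delay_embed(x, dim, tau)
    My = delay_embed(y, dim, tau)
    preds = np.empty(len(My))
    for t in range(len(My)):
        d = np.linalg.norm(My - My[t], axis=1)
        d[t] = np.inf                             # exclude the point itself
        nbrs = np.argsort(d)[:dim + 1]            # dim+1 nearest neighbors on M_y
        w = np.exp(-d[nbrs] / (d[nbrs[0]] + 1e-12))
        w /= w.sum()
        preds[t] = np.dot(w, Mx[nbrs, 0])         # weighted average of x values
    return np.corrcoef(preds, Mx[:, 0])[0, 1]     # prediction skill (correlation)

# Toy coupled system: x drives y, so cross-mapping from y back to x should score higher.
x = np.zeros(2000); y = np.zeros(2000)
x[0], y[0] = 0.4, 0.2
for t in range(1999):
    x[t + 1] = x[t] * (3.8 - 3.8 * x[t])
    y[t + 1] = y[t] * (3.5 - 3.5 * y[t] - 0.1 * x[t])
print("evidence x->y (cross-map from y):", round(ccm_skill(x, y), 2),
      "| evidence y->x (cross-map from x):", round(ccm_skill(y, x), 2))
```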

Sleep is regulated by homeostatic processes, yet the biological basis of sleep pressure, which accumulates during wakefulness, triggers sleep, and dissipates during sleep, remains elusive. We explored a causal relationship between cellular synaptic strength and electroencephalography delta power, a macro-level indicator of sleep pressure, by developing a theoretical framework and a molecular tool to manipulate synaptic strength. The mathematical model predicted that increased synaptic strength promotes the neuronal "down state" and raises delta power.
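
As a caricature of the kind of model described here, and assuming a single mean-field population with firing-rate adaptation (the functions simulate and delta_power and every parameter below are illustrative choices, not the authors' model), one can compare slow-band fluctuations under weaker and stronger recurrent synapses:

```python
# Toy mean-field sketch: vary a synaptic-strength parameter and compare
# the fraction of time spent in a low-rate (down-like) state and the 0.5-4 Hz power.
import numpy as np

def simulate(w, T=60.0, dt=1e-3, seed=0):
    """One excitatory population with firing-rate adaptation and noise; w scales recurrence."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    r = np.zeros(n); a = 0.0
    for t in range(1, n):
        inp = w * r[t - 1] - a + 0.5 * rng.standard_normal()
        drive = np.tanh(max(inp, 0.0))                # threshold-nonlinear rate
        r[t] = r[t - 1] + dt / 0.02 * (-r[t - 1] + drive)
        a += dt / 0.5 * (-a + 2.0 * r[t])             # slow adaptation variable
    return r

def delta_power(r, dt=1e-3):
    """Relative power in the 0.5-4 Hz band of the rate trace."""
    f = np.fft.rfftfreq(len(r), dt)
    p = np.abs(np.fft.rfft(r - r.mean())) ** 2
    return p[(f >= 0.5) & (f <= 4.0)].sum() / p.sum()

for w in (1.0, 2.0):   # weaker vs. stronger recurrent synapses
    r = simulate(w)
    print(f"w={w}: time in low-rate state {(r < 0.2).mean():.2f}, "
          f"relative delta power {delta_power(r):.2f}")
```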

Cortical neurons exhibit highly variable responses over trials and time. Theoretical works posit that this variability potentially arises from the chaotic network dynamics of recurrently connected neurons. Here, we demonstrate that chaotic neural dynamics, formed through synaptic learning, allow networks to perform sensory cue integration in a sampling-based implementation.
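
The chaotic regime referred to here is the classic one in random recurrent rate networks, where a coupling gain above one produces internally generated, trial-to-trial variability. Below is a minimal sketch of that generic substrate only, not the learned cue-integration network of the paper; the network size, gain, and integration scheme are illustrative assumptions.

```python
# Rate chaos in a random recurrent network: tiny differences in the initial
# state lead to diverging trajectories when the gain g exceeds one.
import numpy as np

rng = np.random.default_rng(1)
N, g, dt, steps = 200, 1.8, 0.1, 3000
J = g * rng.standard_normal((N, N)) / np.sqrt(N)     # random coupling matrix

def run(x0):
    x = x0.copy()
    trace = np.empty(steps)
    for t in range(steps):
        x += dt * (-x + J @ np.tanh(x))               # standard rate dynamics
        trace[t] = x[0]                               # record one unit
    return trace

x0 = rng.standard_normal(N)
a = run(x0)
b = run(x0 + 1e-6 * rng.standard_normal(N))           # tiny perturbation of the start
print("late-time divergence of trajectories:", np.abs(a[-500:] - b[-500:]).mean())
```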

The hippocampal subfield CA3 is thought to function as an auto-associative network that stores experiences as memories. Information from these experiences arrives directly from the entorhinal cortex as well as indirectly through the dentate gyrus, which performs sparsification and decorrelation. The computational purpose of these dual input pathways has not been firmly established.

Many modeling works aim to explain behaviors that violate classical economic theories. However, these models often do not fully account for the multi-stage nature of real-life problems or people's tendency to solve complicated problems sequentially. In this work, we propose a descriptive decision-making model for multi-stage problems with perceived post-decision information.

We present a Hopfield-like autoassociative network for memories representing examples of concepts. Each memory is encoded by two activity patterns with complementary properties. The first is dense and correlated across examples within concepts, and the second is sparse and exhibits no correlation among examples.
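
For readers unfamiliar with the baseline architecture, below is a textbook Hopfield sketch with Hebbian storage and iterative retrieval. It does not implement the dual dense/sparse encoding that is the paper's contribution, and the network and pattern sizes are illustrative assumptions.

```python
# Minimal Hopfield-style autoassociative memory: store random +/-1 patterns with the
# Hebbian outer-product rule, then recover a pattern from a corrupted cue.
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10
patterns = rng.choice([-1, 1], size=(P, N))          # random +/-1 memories
W = (patterns.T @ patterns) / N                      # Hebbian outer-product rule
np.fill_diagonal(W, 0.0)

def retrieve(cue, steps=20):
    s = cue.copy().astype(float)
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1
    return s

cue = patterns[0].copy()
flip = rng.choice(N, size=40, replace=False)         # corrupt 20% of the bits
cue[flip] *= -1
overlap = (retrieve(cue) @ patterns[0]) / N          # 1.0 means perfect recall
print("overlap with the stored pattern after retrieval:", overlap)
```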

Sleep is considered to play an essential role in memory reorganization. Despite its importance, classical theoretical models have overlooked several characteristics of sleep. Here, we review recent theoretical approaches investigating their roles in learning and discuss the possibility that non-rapid eye movement (NREM) sleep selectively consolidates memory while rapid eye movement (REM) sleep reorganizes the representations of memories.

Slow waves during non-rapid eye movement (NREM) sleep reflect the alternating up and down states of cortical neurons; global and local slow waves promote memory consolidation and forgetting, respectively. Furthermore, distinct spike-timing-dependent plasticity (STDP) rules operate in these up and down states. The contribution of these different plasticity rules to neural information coding and memory reorganization remains unknown.
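
For reference, the generic pair-based STDP window looks like the sketch below (a standard exponential form with illustrative amplitudes and time constants). The paper's point is that the operative rule differs between up and down states, which this state-independent form does not capture.

```python
# Standard pair-based STDP window: potentiation when the presynaptic spike precedes
# the postsynaptic spike, depression otherwise.
import numpy as np

def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for a spike-time difference dt = t_post - t_pre, in milliseconds."""
    dt_ms = np.asarray(dt_ms, dtype=float)
    return np.where(dt_ms >= 0,
                    a_plus * np.exp(-dt_ms / tau_plus),
                    -a_minus * np.exp(dt_ms / tau_minus))

print(stdp_dw([-40, -10, 10, 40]))   # depression for negative lags, potentiation for positive
```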

  • The study explores how repetitive high-frequency visual stimulation (H-RVS) impacts visual perceptual learning (VPL) during orientation detection tasks.
  • After one training session, H-RVS improved performance, but after seven sessions, it actually worsened performance compared to no stimulation (sham condition).
  • The results indicate that increased brain network activity may sometimes reduce overall performance instead of enhancing it, suggesting a complex relationship in posttraining processing.

A deep neural network is a good task solver, but its operation is difficult to make sense of, and opinions differ on how it should be interpreted. We look at this problem from a new perspective in which an interpretation of task solving is synthesized by quantifying how much, and what kind of, previously unused information is exploited in addition to the information used to solve previous tasks.

For many years, a combination of principal component analysis (PCA) and independent component analysis (ICA) has been used for blind source separation (BSS). However, it remains unclear why these linear methods work well with real-world data that involve nonlinear source mixtures. This work theoretically validates that a cascade of linear PCA and ICA can solve a nonlinear BSS problem accurately when the sensory inputs are generated from hidden sources via nonlinear mappings with sufficient dimensionality.
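
Below is a minimal sketch of the PCA-then-ICA cascade on synthetic, nonlinearly mixed sources. The mixing function, dimensions, and use of scikit-learn's PCA and FastICA are illustrative assumptions; whether recovery is accurate depends on the dimensionality conditions the paper formalizes.

```python
# Linear PCA -> ICA cascade applied to a synthetic nonlinear mixture of two sources.
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
T, n_sources, n_sensors = 5000, 2, 50
S = rng.laplace(size=(T, n_sources))                 # independent hidden sources

# Nonlinear, high-dimensional mixture: random expansion, tanh nonlinearity, linear remix.
A = rng.standard_normal((n_sources, n_sensors))
B = rng.standard_normal((n_sensors, n_sensors)) / np.sqrt(n_sensors)
X = np.tanh(S @ A) @ B                               # sensory inputs

Z = PCA(n_components=n_sources).fit_transform(X)     # linear dimensionality reduction
U = FastICA(n_components=n_sources, random_state=0).fit_transform(Z)  # unmixing

# Check recovery up to permutation and sign via absolute correlations.
C = np.abs(np.corrcoef(S.T, U.T)[:n_sources, n_sources:])
print("source-estimate correlation matrix:\n", C.round(2))
```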

Conventional theories assume that long-term information storage in the brain is implemented by modifying synaptic efficacy. Recent experimental findings challenge this view by demonstrating that dendritic spine sizes, or their corresponding synaptic weights, are highly volatile even in the absence of neural activity. Here, we review previous computational works on the roles of these intrinsic synaptic dynamics.

In the brain, most synapses are formed on minute protrusions known as dendritic spines. Unlike their artificial intelligence counterparts, spines are not merely tuneable memory elements: they also embody algorithms that implement the brain's ability to learn from experience and cope with new challenges. Importantly, they exhibit structural dynamics that depend on activity, excitatory input and inhibitory input (synaptic plasticity or 'extrinsic' dynamics) and dynamics independent of activity ('intrinsic' dynamics), both of which are subject to neuromodulatory influences and reinforcers such as dopamine.

Traveling waves are commonly observed across the brain. While previous studies have suggested the role of traveling waves in learning, the mechanism remains unclear. We adopted a computational approach to investigate the effect of traveling waves on synaptic plasticity.

We propose an analytically tractable neural connectivity model with power-law distributed synaptic strengths. When threshold neurons with a biologically plausible number of incoming connections are considered, our model features a continuous transition to chaos and can reproduce biologically relevant low activity levels and scale-free avalanches, i.e.
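
A brute-force numerical sketch in the same spirit, assuming Pareto-distributed synaptic strengths with random signs and synchronous binary threshold updates; this is not the analytically tractable formulation of the paper, and all parameters are illustrative.

```python
# Binary threshold neurons with heavy-tailed (Pareto) synaptic strengths:
# build a sparse random network, iterate the threshold dynamics, track mean activity.
import numpy as np

rng = np.random.default_rng(2)
N, K, theta, steps = 1000, 100, 1.0, 1000     # neurons, in-degree, threshold, time steps

W = np.zeros((N, N))
for i in range(N):
    pre = rng.choice(N, size=K, replace=False)                     # K random presynaptic inputs
    W[i, pre] = rng.pareto(1.5, size=K) * rng.choice([-1, 1], size=K)

s = (rng.random(N) < 0.5).astype(float)       # random initial binary state
activity = []
for _ in range(steps):
    s = (W @ s > theta).astype(float)          # synchronous threshold update
    activity.append(s.mean())
print("mean activity over the last 200 steps:", np.mean(activity[-200:]))
```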

A new model of search based on stochastic resetting is introduced, wherein the rate of resets depends explicitly on the time elapsed since the beginning of the process. It is shown that a rate inversely proportional to time leads to paradoxical diffusion, which mixes self-similarity and linear growth of the mean-square displacement with nonlocality and a non-Gaussian propagator. It is argued that such a resetting protocol offers a general and efficient search-boosting method that does not need to be optimized with respect to the scale of the underlying search problem (e.
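
A Monte-Carlo sketch of the protocol, assuming 1D Brownian motion with resets to the origin at rate r(t) = c/t measured from the start of the process; the diffusion constant, prefactor, and step sizes are illustrative choices.

```python
# Diffusion with a time-dependent resetting rate r(t) = c / t: simulate an ensemble
# of walkers and estimate the mean-square displacement at the final time.
import numpy as np

rng = np.random.default_rng(3)
n_walkers, steps, dt, D, c = 2000, 5000, 0.01, 1.0, 1.0

x = np.zeros(n_walkers)
for step in range(1, steps + 1):
    t = step * dt
    x += np.sqrt(2 * D * dt) * rng.standard_normal(n_walkers)   # diffusive increment
    # Reset probability in this step; near t = 0 the rate diverges, so early resets are certain.
    reset = rng.random(n_walkers) < (c / t) * dt
    x[reset] = 0.0                                               # return to the origin
print("MSD at the final time:", np.mean(x ** 2),
      "(free diffusion would give 2*D*t =", 2 * D * steps * dt, ")")
```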

Sense of agency (SoA) refers to the experience or belief that one's own actions caused an external event. Here we present a model of SoA in the framework of optimal Bayesian cue integration with mutually involved principles, namely the reliability of the action and outcome sensory signals, their consistency with the outcome being caused by the action, and the prior belief in causation. We used our Bayesian model to explain the intentional binding effect, which is regarded as a reliable indicator of SoA.
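
A stripped-down Gaussian cue-fusion sketch of how a causal prior compresses the perceived action-outcome interval: precision-weighted averaging only, with illustrative numbers. The full model in the paper additionally weighs cue consistency and the prior probability of causation.

```python
# Precision-weighted fusion of a noisy interval judgment with a short prior interval
# expected under the belief "my action caused the outcome" (intentional binding as compression).
import numpy as np

def fuse(mu_sens, var_sens, mu_prior, var_prior):
    """Posterior mean and variance for two Gaussian estimates combined by precision weighting."""
    w = (1 / var_sens) / (1 / var_sens + 1 / var_prior)
    mu_post = w * mu_sens + (1 - w) * mu_prior
    var_post = 1 / (1 / var_sens + 1 / var_prior)
    return mu_post, var_post

sensed_interval, sensory_var = 250.0, 60.0 ** 2     # ms; noisy timing judgment
prior_interval, prior_var = 100.0, 50.0 ** 2        # short expected delay under causation

with_causation, _ = fuse(sensed_interval, sensory_var, prior_interval, prior_var)
print(f"perceived interval: {sensed_interval:.0f} ms without causal belief, "
      f"{with_causation:.0f} ms with it (binding = compression toward the prior)")
```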

It is often assumed that Hebbian synaptic plasticity forms a cell assembly, a mutually interacting group of neurons that encodes memory. However, in recurrently connected networks with pure Hebbian plasticity, cell assemblies typically diverge or fade under ongoing changes of synaptic strength. Previously assumed mechanisms that stabilize cell assemblies do not robustly reproduce the experimentally reported unimodal and long-tailed distribution of synaptic strengths.
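
A toy demonstration of the instability mentioned here, assuming a small rate network with an unconstrained Hebbian outer-product update; the parameters are illustrative and this is not the model analyzed in the paper.

```python
# Pure Hebbian learning in a recurrent rate network: with bounded positive rates and no
# constraint on the weights, the mean synaptic strength keeps growing over time.
import numpy as np

rng = np.random.default_rng(4)
N, eta, dt = 50, 1e-3, 0.1
W = 0.1 * rng.random((N, N)); np.fill_diagonal(W, 0.0)
r = rng.random(N)

for step in range(5001):
    r += dt * (-r + np.tanh(W @ r + 0.1 * rng.standard_normal(N)))   # rate dynamics
    W += eta * np.outer(r, r)                                        # pure Hebbian update
    np.fill_diagonal(W, 0.0)
    if step % 1000 == 0:
        print(f"step {step}: mean synaptic strength {W.mean():.3f}")
```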

Animals need to adjust their inferences according to the context they are in. This is required for the multi-context blind source separation (BSS) task, where an agent needs to infer hidden sources from their context-dependent mixtures. The agent is expected to invert this mixing process for all contexts.
