Publications by authors named "Philip Sabes"

The phase of the electroencephalographic (EEG) signal predicts performance in motor, somatosensory, and cognitive functions. Studies suggest that brain phase resets align neural oscillations with external stimuli, or couple oscillations across frequency bands and brain regions.

Recent studies have identified rotational dynamics in motor cortex (MC), which many assume arise from intrinsic connections in MC. However, behavioral and neurophysiological studies suggest that MC behaves like a feedback controller where continuous sensory feedback and interactions with other brain areas contribute substantially to MC processing. We investigated these apparently conflicting theories by building recurrent neural networks that controlled a model arm and received sensory feedback from the limb.

Article Synopsis
  • Optogenetics has transformed neuroscience research in small animals, but its effectiveness in non-human primates (NHPs) has shown mixed results.
  • A centralized database has been created to help researchers track both successful and unsuccessful optogenetic experiments in primates, with contributions from 45 laboratories worldwide.
  • The database, available on the Open Science Framework, aims to enhance research by sharing over 1,000 injection experiments and offers insights to improve optogenetic methods in NHPs.
Stimulation of the cortex can modulate the connectivity between brain regions. Although targeted neuroplasticity has been demonstrated in vitro, in vivo models have been inconsistent in their response to stimulation. In this paper, we tested various stimulation protocols to characterize the effect of stimulation on coherence in the non-human primate cortex in vivo.

In non-human primate (NHP) optogenetics, infecting large cortical areas with viral vectors is often a difficult and time-consuming task. Here, we demonstrate the use of magnetic resonance (MR)-guided convection enhanced delivery (CED) of optogenetic viral vectors into primary somatosensory (S1) and motor (M1) cortices of macaques to obtain efficient, widespread cortical expression of light-sensitive ion channels. Adeno-associated viral (AAV) vectors encoding the red-shifted opsin C1V1 fused to enhanced yellow fluorescent protein (EYFP) were injected into the cortex of rhesus macaques under MR-guided CED.

Optogenetics is a powerful tool that enables millisecond-level control of the activity of specific groups of neurons. Furthermore, it has the great advantage of artifact-free recordings. These characteristics make this technique ideal for relating brain function to behavior in animals with sophisticated behavioral capabilities, such as non-human primates (NHPs).

Brain stimulation modulates the excitability of neural circuits and drives neuroplasticity. While the local effects of stimulation have been an active area of investigation, the effects on large-scale networks remain largely unexplored. We studied stimulation-induced changes in network dynamics in two macaques.

Objective: The aim of this work is to improve the state of the art for motor control with a brain-machine interface (BMI). BMIs use neurological recording devices and decoding algorithms to transform brain activity directly into real-time control of a machine, archetypically a robotic arm or a cursor. The standard procedure treats neural activity (vectors of spike counts in small temporal windows) as noisy observations of the kinematic state (position, velocity, acceleration) of the fingertip.
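That observation model is the standard Kalman-filter formulation of BMI decoding: spike counts are modeled as linear functions of the kinematic state plus Gaussian noise, and the state is estimated recursively. A minimal sketch, with toy dimensions and randomly generated matrices standing in for parameters that would normally be fit to training data:

```python
import numpy as np

def kalman_decode(spikes, A, C, Q, R, x0, P0):
    """Decode kinematic states from spike-count observations.

    spikes: (T, n_units) spike counts per time bin
    A, Q:   state-transition matrix and its noise covariance
    C, R:   observation (tuning) matrix and its noise covariance
    """
    x, P = x0, P0
    states = []
    for y in spikes:
        # Predict: propagate the kinematic state forward one bin
        x = A @ x
        P = A @ P @ A.T + Q
        # Update: correct the prediction with the observed spike counts
        K = P @ C.T @ np.linalg.inv(C @ P @ C.T + R)
        x = x + K @ (y - C @ x)
        P = (np.eye(len(x)) - K @ C) @ P
        states.append(x)
    return np.array(states)

# Toy example: 4-D state (x, y position and velocity), 20 units
rng = np.random.default_rng(0)
A = np.block([[np.eye(2), 0.05 * np.eye(2)],
              [np.zeros((2, 2)), 0.98 * np.eye(2)]])
C = rng.normal(size=(20, 4))            # hypothetical tuning matrix
Q, R = 0.01 * np.eye(4), 1.0 * np.eye(20)
spikes = rng.poisson(3.0, size=(50, 20)).astype(float)
xs = kalman_decode(spikes, A, C, Q, R, np.zeros(4), np.eye(4))
```

The matrices and dimensions here are illustrative only; in practice A, C, Q, and R are estimated from simultaneously recorded kinematics and spikes.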

Background: In non-human primate (NHP) optogenetics, infecting large cortical areas with viral vectors is often a difficult and time-consuming task. Previous work has shown that parenchymal delivery of adeno-associated virus (AAV) in the thalamus by convection-enhanced delivery (CED) can lead to large-scale transduction via axonal transport in distal areas including cortex. We used this approach to obtain widespread cortical expression of light-sensitive ion channels.

Dorsal premotor (PMd) and primary motor (M1) cortices play a central role in mapping sensation to movement. Many studies of these areas have focused on correlation-based tuning curves relating neural activity to task or movement parameters, but the link between tuning and movement generation is unclear. We recorded motor preparatory activity from populations of neurons in PMd/M1 as macaque monkeys performed a visually guided reaching task and show that tuning curves for sensory inputs (reach target direction) and motor outputs (initial movement direction) are not typically aligned.

Naturalistic control of brain-machine interfaces will require artificial proprioception, potentially delivered via intracortical microstimulation (ICMS). We have previously shown that multi-channel ICMS can guide a monkey's reaches to unseen targets in a planar workspace. Here, we expand on that work, asking how ICMS is decoded into target angle and distance by analyzing the monkey's performance when the ICMS feedback was degraded.

While optogenetics offers great potential for linking brain function and behavior in nonhuman primates, taking full advantage of that potential will require stable access for optical stimulation and concurrent monitoring of neural activity. Here we present a practical, stable interface for stimulation and recording of large-scale cortical circuits. To obtain optogenetic expression across a broad region, here spanning primary somatosensory (S1) and motor (M1) cortices, we used convection-enhanced delivery of the viral vector, with online guidance from MRI.

Tracking moving objects, including one's own body, is a fundamental ability of higher organisms, playing a central role in many perceptual and motor tasks. While it is unknown how the brain learns to follow and predict the dynamics of objects, it is known that this process of state estimation can be learned purely from the statistics of noisy observations. When the dynamics are simply linear with additive Gaussian noise, the optimal solution is the well-known Kalman filter (KF), whose parameters can be learned via latent-variable density estimation (the EM algorithm).
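For a scalar version of that setup, the EM procedure alternates a Kalman filter/smoother E-step with closed-form M-step updates of the dynamics and noise parameters. A sketch under simplifying assumptions (one-dimensional state, observation gain fixed at 1, made-up initial guesses), not the paper's actual model:

```python
import numpy as np

def em_kalman_1d(y, n_iter=50, x0=0.0, P0=1.0):
    """Learn scalar LDS parameters (a, q, r) via EM, where
    x_t = a*x_{t-1} + w_t (Var w = q) and y_t = x_t + v_t (Var v = r)."""
    T = len(y)
    a, q, r = 0.5, 1.0, 1.0                  # arbitrary initial guesses
    for _ in range(n_iter):
        # E-step, forward pass: Kalman filter
        xf, Pf = np.zeros(T), np.zeros(T)
        xp, Pp = np.zeros(T), np.zeros(T)
        for t in range(T):
            xp[t] = a * xf[t - 1] if t > 0 else x0
            Pp[t] = a * a * Pf[t - 1] + q if t > 0 else P0
            K = Pp[t] / (Pp[t] + r)
            xf[t] = xp[t] + K * (y[t] - xp[t])
            Pf[t] = (1 - K) * Pp[t]
        # E-step, backward pass: RTS smoother
        xs, Ps = xf.copy(), Pf.copy()
        Pc = np.zeros(T)                     # Pc[t] = Cov(x_t, x_{t-1})
        for t in range(T - 2, -1, -1):
            J = a * Pf[t] / Pp[t + 1]
            xs[t] = xf[t] + J * (xs[t + 1] - xp[t + 1])
            Ps[t] = Pf[t] + J * J * (Ps[t + 1] - Pp[t + 1])
            Pc[t + 1] = J * Ps[t + 1]
        # Sufficient statistics
        Ex2 = xs ** 2 + Ps                   # E[x_t^2]
        Exx = xs[1:] * xs[:-1] + Pc[1:]      # E[x_t x_{t-1}]
        # M-step: closed-form parameter updates
        a = Exx.sum() / Ex2[:-1].sum()
        q = np.mean(Ex2[1:] - 2 * a * Exx + a * a * Ex2[:-1])
        r = np.mean(y ** 2 - 2 * y * xs + Ex2)
    return a, q, r

# Recover parameters from simulated noisy observations
rng = np.random.default_rng(1)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.95 * x[t - 1] + rng.normal(scale=0.5)
y = x + rng.normal(scale=1.0, size=500)
a_hat, q_hat, r_hat = em_kalman_1d(y)
```

With enough data, a_hat approaches the true dynamics coefficient; the multivariate case replaces the scalar updates with the corresponding matrix expressions.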

Proprioception, the sense of the body's position in space, is important to natural movement planning and execution, and will likewise be necessary for successful motor prostheses and brain-machine interfaces (BMIs). Here we demonstrate that monkeys were able to learn to use an initially unfamiliar multichannel intracortical microstimulation signal, which provided continuous information about hand position relative to an unseen target, to complete accurate reaches. Furthermore, the monkeys combined this artificial signal with vision to form an optimal, minimum-variance estimate of relative hand position.
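Minimum-variance integration of two independent, unbiased cues weights each by its inverse variance, and the fused estimate is more reliable than either cue alone. A small illustration with made-up numbers (not the paper's data):

```python
def combine(est_a, var_a, est_b, var_b):
    """Minimum-variance (inverse-variance weighted) combination of two
    independent, unbiased cues about the same quantity."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_b)
    est = w_a * est_a + (1 - w_a) * est_b
    var = 1 / (1 / var_a + 1 / var_b)
    return est, var

# Vision says the hand is 10 mm from the target (precise, var 4),
# an artificial cue says 16 mm (noisier, var 16): the fused estimate
# sits nearer the reliable cue, with lower variance than either.
est, var = combine(10.0, 4.0, 16.0, 16.0)  # est = 11.2, var = 3.2
```

The behavioral signature of this model is that reach endpoints shift toward the more reliable cue and endpoint variance drops below that of either single-cue condition.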

Even well-practiced movements cannot be repeated without variability. This variability is thought to reflect "noise" in movement preparation or execution. However, we show that, for both professional baseball pitchers and macaque monkeys making reaching movements, motor variability can be decomposed into two statistical components, a slowly drifting mean and fast trial-by-trial fluctuations about the mean.
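One simple way to perform such a decomposition is to estimate the slowly drifting mean with a moving average and treat the residuals as the fast trial-by-trial fluctuations. An illustrative simulation on synthetic data (not the paper's, and a cruder estimator than a fitted state-space model):

```python
import numpy as np

rng = np.random.default_rng(2)
n_trials = 2000
drift = np.cumsum(rng.normal(scale=0.02, size=n_trials))  # slow random walk
noise = rng.normal(scale=1.0, size=n_trials)              # fast fluctuations
errors = drift + noise                                    # observed per-trial error

# Estimate the slowly drifting mean with a moving average, then split
# each trial's error into drift and residual components.
window = 50
kernel = np.ones(window) / window
drift_hat = np.convolve(errors, kernel, mode="same")
residual = errors - drift_hat
```

The residual series carries most of the short-timescale variance, while the smoothed trace tracks the slow wandering of the mean.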

Sensory processing in the brain includes three key operations: multisensory integration, the task of combining cues into a single estimate of a common underlying stimulus; coordinate transformations, the change of reference frame for a stimulus (e.g., retinotopic to body-centered) effected through knowledge about an intervening variable.

Although multisensory integration has been well modeled at the behavioral level, the link between these behavioral models and the underlying neural circuits is still not clear. This gap is even greater for the problem of sensory integration during movement planning and execution. The difficulty lies in applying simple models of sensory integration to the complex computations that are required for movement control and to the large networks of brain areas that perform these computations.

Most voluntary actions rely on neural circuits that map sensory cues onto appropriate motor responses. One might expect that for everyday movements, like reaching, this mapping would remain stable over time, at least in the absence of error feedback. Here we describe a simple and novel psychophysical phenomenon in which recent experience shapes the statistical properties of reaching, independent of any movement errors.

The planning and control of sensory-guided movements requires the integration of multiple sensory streams. Although the information conveyed by different sensory modalities is often overlapping, the shared information is represented differently across modalities during the early stages of cortical processing. We ask how these diverse sensory signals are represented in multimodal sensorimotor areas of cortex in macaque monkeys.

The sensory signals that drive movement planning arrive in a variety of 'reference frames', and integrating or comparing them requires sensory transformations. We propose a model in which the statistical properties of sensory signals and their transformations determine how these signals are used. This model incorporates the patterns of gaze-dependent errors that we found in our human psychophysics experiment when the sensory signals available for reach planning were varied.

The sensorimotor calibration of visually guided reaching changes on a trial-to-trial basis in response to random shifts in the visual feedback of the hand. We show that a simple linear dynamical system is sufficient to model the dynamics of this adaptive process. In this model, an internal variable represents the current state of sensorimotor calibration.
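A linear dynamical system of this kind can be written as x_{t+1} = A·x_t + B·e_t, where x_t is the internal calibration state and e_t the error experienced under the shifted visual feedback. A sketch with hypothetical retention (A) and learning-rate (B) values, not the fitted parameters from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials = 300
shifts = rng.normal(scale=5.0, size=n_trials)  # random visual feedback shifts (mm)

A, B = 0.95, 0.2       # hypothetical retention and learning-rate parameters
x = 0.0                # internal calibration state
state, errors = [], []
for s in shifts:
    err = s - x        # error experienced on this trial
    x = A * x + B * err  # linear update of the calibration state
    state.append(x)
    errors.append(err)
state = np.array(state)
```

Because the update is linear, A and B can be recovered from behavioral data by regressing each trial's state change against the preceding state and error.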

Recent studies have employed simple linear dynamical systems to model trial-by-trial dynamics in various sensorimotor learning tasks. Here we explore the theoretical and practical considerations that arise when employing the general class of linear dynamical systems (LDS) as a model for sensorimotor learning. In this framework, the state of the system is a set of parameters that define the current sensorimotor transformation: the function that maps sensory inputs to motor outputs.

When planning target-directed reaching movements, human subjects combine visual and proprioceptive feedback to form two estimates of the arm's position: one to plan the reach direction, and another to convert that direction into a motor command. These position estimates are based on the same sensory signals but rely on different combinations of visual and proprioceptive input, suggesting that the brain weights sensory inputs differently depending on the computation being performed. Here we show that the relative weighting of vision and proprioception depends both on the sensory modality of the target and on the information content of the visual feedback, and that these factors affect the two stages of planning independently.

When planning goal-directed reaches, subjects must estimate the position of the arm by integrating visual and proprioceptive signals from the sensory periphery. These integrated position estimates are required at two stages of motor planning: first to determine the desired movement vector, and second to transform the movement vector into a joint-based motor command. We quantified the contributions of each sensory modality to the position estimate formed at each planning stage.

When monkeys make saccadic eye movements to simple visual targets, neurons in the lateral intraparietal area (LIP) display a retinotopic, or eye-centered, coding of the target location. However, natural saccadic eye movements are often directed at objects, or parts of objects, in the visual scene. In this paper we investigate whether LIP represents saccadic eye movements differently when the target is specified as part of a visually displayed object.
