Publications by authors named "Atul Gopal"

Fast movements, such as saccadic eye movements, occur in the absence of sensory feedback and are therefore thought to be controlled by internal feedback. Such internal feedback provides an instantaneous estimate of the output that serves as a proxy for sensory feedback and can be used by the controller to correct deviations from the desired plan. In the predominant view, the desired plan/input takes the form of a static displacement signal (the endpoint model), believed to be encoded in the spatial map of the superior colliculus (SC).
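As a rough illustration of the internal-feedback idea (not the specific model tested in the article), the sketch below drives a simulated saccade from a static desired-displacement command and uses an integrated copy of the motor output as the instantaneous internal estimate; the gain, time step, and duration are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the model tested in the article): a local internal-feedback
# loop in which a static desired-displacement command drives the saccade and an
# integrated copy of the motor output serves as the instantaneous internal estimate.
# The gain, time step, and duration are illustrative assumptions.

def simulate_endpoint_saccade(desired_displacement_deg=10.0, gain=80.0,
                              dt=0.001, t_max=0.1):
    """Drive eye velocity from the dynamic motor error (desired - estimated)."""
    estimated_displacement = 0.0                 # internal estimate of output (deg)
    trajectory = [estimated_displacement]
    for _ in np.arange(0.0, t_max, dt):
        motor_error = desired_displacement_deg - estimated_displacement
        velocity = gain * motor_error            # deg/s; internal feedback sets the drive
        estimated_displacement += velocity * dt  # integrate the efference copy
        trajectory.append(estimated_displacement)
    return np.array(trajectory)

traj = simulate_endpoint_saccade()
print(f"final displacement ~ {traj[-1]:.2f} deg")  # converges toward the 10 deg goal
```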

Significant progress has been made in understanding the computational and neural mechanisms that mediate eye and hand movements made in isolation. However, less is known about the mechanisms that control these movements when they are coordinated. Here, we outline our computational approaches using accumulation-to-threshold and race-to-threshold models to elucidate the mechanisms that initiate and inhibit these movements.
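As a rough sketch of the race-to-threshold idea, the code below races a GO accumulator against a later-starting STOP accumulator; whichever crosses threshold first determines whether the movement is initiated or inhibited. All rates, delays, and noise values are assumed for illustration, not fitted to data.

```python
import numpy as np

rng = np.random.default_rng(0)

def race_trial(go_rate=1.0, stop_rate=2.0, stop_delay_ms=50,
               threshold=100.0, noise_sd=2.0, dt=1.0, t_max=600):
    """Race a GO and a STOP accumulator; return ('go', RT) or ('stop', None).

    All rates, delays, and noise values are illustrative assumptions, not fits.
    """
    go, stop, t = 0.0, 0.0, 0.0
    while t < t_max:
        t += dt
        go += go_rate * dt + rng.normal(0.0, noise_sd)
        if t > stop_delay_ms:                      # STOP unit starts at the stop-signal delay
            stop += stop_rate * dt + rng.normal(0.0, noise_sd)
        if stop >= threshold:
            return "stop", None                    # movement inhibited
        if go >= threshold:
            return "go", t                         # movement initiated at time t (ms)
    return "go", t

outcomes = [race_trial() for _ in range(1000)]
p_cancel = np.mean([o == "stop" for o, _ in outcomes])
print(f"P(cancel) = {p_cancel:.2f}")               # roughly half the trials at this delay
```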

The superior colliculus (SC) is an important structure in the mammalian brain that orients the animal toward distinct visual events. Visually responsive neurons in SC are modulated by visual object features, including size, motion, and color. However, it remains unclear whether SC activity is modulated by non-visual object features, such as the reward value associated with the object.

The direct and indirect pathways of the basal ganglia work together to control behavior. However, it remains controversial whether these pathways are segregated or merge with each other. To address this issue, we studied the connections of these two pathways in the caudal parts of the basal ganglia of rhesus monkeys using anatomical tracers.

Although race models have been extensively used to study inhibitory control, the mechanisms that enable a change of reach plans within such models remain unexplored. We used a redirect task, in which targets occasionally changed their locations, to study the control of reaching movements during both the planning and execution phases. We tested nine different race-model architectures that could explain the redirect behavior of reaching movements.
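One hypothetical architecture of this kind is sketched below with assumed parameters: a GO1 unit plans the initial reach, and after the target jump a STOP unit inhibits GO1 while a GO2 unit accumulates toward the new target. This is for illustration only and is not necessarily one of the nine architectures tested in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def redirect_trial(rate=0.8, threshold=100.0, noise_sd=2.0,
                   jump_time_ms=60, inhibition=3.0, dt=1.0, t_max=800):
    """One trial of a hypothetical interactive-race architecture for redirecting
    a reach. GO1 plans the initial movement; after the target jump, a STOP unit
    inhibits GO1 while GO2 accumulates toward the new target. Parameter values
    are assumptions for illustration only.
    """
    go1 = stop = go2 = 0.0
    t = 0.0
    while t < t_max:
        t += dt
        go1 += rate * dt + rng.normal(0.0, noise_sd)
        if t > jump_time_ms:                            # target has jumped
            stop += rate * dt + rng.normal(0.0, noise_sd)
            go1 -= inhibition * stop * dt / threshold   # STOP suppresses the initial plan
            go2 += rate * dt + rng.normal(0.0, noise_sd)
        if go1 >= threshold:
            return "initial_reach", t                   # failed to redirect
        if go2 >= threshold:
            return "redirected_reach", t
    return "no_movement", None

counts = {}
for _ in range(1000):
    outcome, _ = redirect_trial()
    counts[outcome] = counts.get(outcome, 0) + 1
print(counts)                                           # mostly redirected, some failures
```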

In contrast to hand movements, it is unclear whether saccade kinematics have a neural representation. Saccade kinematics are typically thought to be specified by motor error/desired displacement and generated by brainstem circuits that are impervious to voluntary control. We studied the influence of instructed hand movement velocity on the kinematics of saccades executed without explicit instructions.

Eye and hand movements are initiated by anatomically separate regions in the brain, and yet these movements can be flexibly coupled and decoupled, depending on the need. The computational architecture that enables this flexible coupling of independent effectors is not understood. Here, we studied the computational architecture that enables flexible eye-hand coordination using a drift diffusion framework, which predicts that the variability of the reaction time (RT) distribution scales with its mean.
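The scaling property can be illustrated with a minimal single-bound diffusion simulation; the threshold, noise level, and drift rates below are assumptions for illustration, not the values fitted in the study. Lowering the drift rate raises both the mean and the spread of the simulated RT distribution.

```python
import numpy as np

rng = np.random.default_rng(2)

def ddm_rt(drift, threshold=1.0, noise_sd=0.1, dt=0.001, t_max=3.0):
    """First-passage time (s) of a single-bound drift-diffusion process.
    Threshold, noise, and step size are illustrative assumptions."""
    x, t = 0.0, 0.0
    while x < threshold and t < t_max:
        x += drift * dt + noise_sd * np.sqrt(dt) * rng.normal()
        t += dt
    return t

for drift in (2.0, 4.0, 8.0):                   # weaker drift -> slower responses
    rts = np.array([ddm_rt(drift) for _ in range(500)])
    print(f"drift={drift}: mean RT={rts.mean():.3f} s, SD={rts.std():.3f} s")
```

In this sketch, both the mean and the standard deviation of the first-passage times grow as the drift rate falls, which is the mean-variance scaling the framework exploits.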

Voluntary control has been extensively studied in the context of eye and hand movements made in isolation, yet little is known about the nature of control during eye-hand coordination. We probed this with a redirect task in which subjects made reaching/pointing movements accompanied by coordinated eye movements but had to change their plans when, on some trials, the target changed its position.

Many studies of reaching and pointing have shown significant spatial and temporal correlations between eye and hand movements. Nevertheless, it remains unclear whether these correlations are incidental, arising from common inputs (independent model); whether these correlations represent an interaction between otherwise independent eye and hand systems (interactive model); or whether these correlations arise from a single dedicated eye-hand system (common command model). Subjects were instructed to redirect gaze and pointing movements in a double-step task in an attempt to decouple eye-hand movements and causally distinguish between the three architectures.

The computational architecture that enables the flexible coupling between otherwise independent eye and hand effector systems is not understood. By using a drift diffusion framework, in which variability of the reaction time (RT) distribution scales with mean RT, we tested the ability of a common stochastic accumulator to explain eye-hand coordination. Using a combination of behavior, computational modeling, and electromyography, we show how a single stochastic accumulator to threshold, followed by noisy effector-dependent delays, explains eye-hand RT distributions and their correlation, while alternative models with independent or interactive eye and hand accumulators do not.
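A minimal sketch of that common-accumulator architecture, with hypothetical parameter values rather than the fitted ones: one shared decision time per trial, plus independent noisy eye- and hand-specific delays, is enough to produce strongly correlated eye and hand RTs.

```python
import numpy as np

rng = np.random.default_rng(3)

n_trials = 2000
# Hypothetical parameters, for illustration only (not the fitted values).
decision_time = 200 + 50 * rng.standard_normal(n_trials)   # shared accumulator finish times (ms)
eye_delay     = 50 + 10 * rng.standard_normal(n_trials)    # noisy eye-specific efferent delay (ms)
hand_delay    = 120 + 25 * rng.standard_normal(n_trials)   # noisy hand-specific efferent delay (ms)

eye_rt  = decision_time + eye_delay
hand_rt = decision_time + hand_delay

r = np.corrcoef(eye_rt, hand_rt)[0, 1]
print(f"eye-hand RT correlation = {r:.2f}")   # high, because the decision stage is shared
```

Because the shared decision stage carries most of the trial-to-trial variance, the simulated eye and hand RTs remain tightly correlated even though their efferent delays are independent; fully independent accumulators would not produce this by construction.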
