Parietal and frontal cortex are involved in saccade generation, and their output signals modify visual signals throughout cortex. Local signals associated with these interactions are well described, but their large-scale progression and network dynamics are unknown. Here, we combined source-localized electroencephalography (EEG) and graph-theory analysis (GTA) to understand how saccades and presaccadic visual stimuli interactively alter cortical network dynamics in humans.
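As an illustration of the graph-theory component of such analyses, the minimal sketch below computes two standard network metrics, mean clustering and global efficiency, from a thresholded connectivity matrix. It is not the pipeline used in the study; the source count, connectivity values, and 20% threshold are placeholder assumptions.

# Minimal sketch (assumed, not the study's pipeline): compute graph-theory
# metrics from a source-localized EEG connectivity matrix using NetworkX.
# The connectivity matrix here is random placeholder data; in practice it
# would come from, e.g., coherence or phase-locking between cortical sources.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n_sources = 32                              # hypothetical number of cortical sources
conn = rng.random((n_sources, n_sources))   # placeholder connectivity values
conn = (conn + conn.T) / 2                  # symmetrize (undirected graph)
np.fill_diagonal(conn, 0)

# Keep the strongest 20% of connections (an assumed, commonly used threshold).
thresh = np.quantile(conn[conn > 0], 0.8)
adj = (conn >= thresh).astype(int)

G = nx.from_numpy_array(adj)
print("mean clustering coefficient:", nx.average_clustering(G))
print("global efficiency:", nx.global_efficiency(G))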
Various models (e.g., scalar, state-dependent network, and vector models) have been proposed to explain the global aspects of time perception, but they have not been tested against specific visual phenomena like perisaccadic time compression and novel stimulus time dilation.
Eye-centered (egocentric) and landmark-centered (allocentric) visual signals influence spatial cognition, navigation, and goal-directed action, but the neural mechanisms that integrate these signals for motor control are poorly understood. A likely candidate for egocentric/allocentric integration in the gaze control system is the supplementary eye field (SEF), a mediofrontal structure with high-level "executive" functions, spatially tuned visual/motor response fields, and reciprocal projections with the frontal eye fields (FEF). To test this hypothesis, we trained two head-unrestrained monkeys to saccade toward a remembered visual target in the presence of a visual landmark that shifted during the delay, causing gaze end points to shift partially in the same direction.
Sensorimotor transformations require spatiotemporal coordination of signals, that is, coordination through both time and space. For example, the gaze control system employs signals that are time-locked to various sensorimotor events, but the spatial content of these signals is difficult to assess during ordinary gaze shifts. In this review, we describe the various models and methods that have been devised to address this question, and their limitations.
The visual system is thought to separate egocentric and allocentric representations, but behavioral experiments show that these codes are optimally integrated to influence goal-directed movements. To test whether frontal cortex participates in this integration, we recorded primate frontal eye field activity during a cue-conflict memory-delay saccade task. To dissociate egocentric and allocentric coordinates, we surreptitiously shifted a visual landmark during the delay period, causing saccade end points to deviate in the same direction by 37% of the landmark shift.
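For illustration, the sketch below shows one common way such an allocentric weight can be quantified: the mean saccade end-point shift is projected onto the landmark-shift direction and expressed as a fraction of the landmark displacement (0.37 corresponds to 37%). All data, names, and values are hypothetical and stand in for the study's actual measurements.

# Minimal sketch with hypothetical data: allocentric weight computed as the
# component of the saccade end-point shift along the landmark shift, divided
# by the size of the landmark shift.
import numpy as np

landmark_shift = np.array([8.0, 0.0])            # landmark displacement (deg), assumed
baseline_endpoints = np.array([[0.2, -0.2],      # end points with a stable landmark
                               [-0.2, 0.3]])
shifted_endpoints = np.array([[3.10, -0.1],      # end points after the landmark shift
                              [2.82,  0.4]])

shift_vec = shifted_endpoints.mean(axis=0) - baseline_endpoints.mean(axis=0)
unit = landmark_shift / np.linalg.norm(landmark_shift)
allocentric_weight = np.dot(shift_vec, unit) / np.linalg.norm(landmark_shift)
print(f"allocentric weight: {allocentric_weight:.2f}")  # 0.37 for this toy data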
Gaze saccades, rapid shifts of the eyes and head toward a goal, have provided fundamental insights into the neural control of movement. For example, it has been shown that the superior colliculus (SC) transforms a visual target (T) code to future gaze (G) location commands after a memory delay. However, this transformation has not been observed in "reactive" saccades made directly to a stimulus, so its contribution to normal gaze behavior is unclear.
Nonhuman primates have been used extensively to study eye-head coordination and eye-hand coordination, but the combination, eye-head-hand coordination, has not been studied. Our goal was to determine whether reaching influences eye-head coordination (and vice versa) in rhesus macaques. Eye and head motion were recorded with search coils, and hand motion with a touch screen, in two animals.
Cervical dystonia (CD) is characterized by abnormal twisting and turning of the head with associated head oscillations. It is the most common form of dystonia, which in turn is the third most common movement disorder. Despite its frequent occurrence, adequate therapy is lacking, largely because its pathophysiology remains uncertain.
The memory-delay saccade task is often used to separate visual and motor responses in oculomotor structures such as the superior colliculus (SC), with the assumption that these same responses would sum with a short delay during immediate "reactive" saccades to visual stimuli. However, it is also possible that additional signals (suppression, delay) alter visual and/or motor responses in the memory-delay task. Here, we compared the spatiotemporal properties of visual and motor responses of SC neurons recorded during both the reactive and memory-delay tasks in two head-unrestrained monkeys.
The frontal eye fields (FEFs) participate in both working memory and sensorimotor transformations for saccades, but their role in integrating these functions through time remains unclear. Here, we tracked FEF spatial codes through time using a novel analytic method applied to the classic memory-delay saccade task. Three-dimensional recordings of head-unrestrained gaze shifts were made in two monkeys trained to make gaze shifts toward briefly flashed targets after a variable delay (450-1500 ms).
We previously reported that visuomotor activity in the superior colliculus (SC), a key midbrain structure for the generation of rapid eye movements, preferentially encodes target position relative to the eye (Te) during low-latency head-unrestrained gaze shifts (DeSouza et al., 2011). Here, we trained two monkeys to perform head-unrestrained gaze shifts after a variable post-stimulus delay (400-700 ms), to test whether temporally separated SC visual and motor responses show different spatial codes.
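The sketch below illustrates, with simulated data, the general logic of such spatial-code comparisons: a neuron's response field is fit in two candidate coordinate frames, target-in-eye (Te) versus final gaze position (G), and the frame yielding the smaller fitting residual is taken as the preferred code. The Gaussian field model, grid search, and all values are assumptions for illustration, not the recorded data or the authors' fitting procedure.

# Minimal sketch (hypothetical data): compare how well a simulated neuron's
# firing is predicted by target-in-eye (Te) versus gaze end point (G) coordinates.
import numpy as np

rng = np.random.default_rng(1)
n_trials = 200
Te = rng.uniform(-40, 40, (n_trials, 2))     # target position in eye coordinates (deg)
G = Te + rng.normal(0, 5, (n_trials, 2))     # gaze end points with variable error

# Simulated firing rates from a Gaussian response field centered in Te coordinates.
center = np.array([15.0, -5.0])
rate = (80 * np.exp(-np.sum((Te - center) ** 2, axis=1) / (2 * 10.0 ** 2))
        + rng.normal(0, 3, n_trials))

def fit_residual(pos, rate, sigma=10.0):
    """Grid-search the center of a fixed-width Gaussian field; return residual SS."""
    best = np.inf
    for cx in np.arange(-40, 41, 5):
        for cy in np.arange(-40, 41, 5):
            pred = np.exp(-np.sum((pos - [cx, cy]) ** 2, axis=1) / (2 * sigma ** 2))
            gain = np.dot(pred, rate) / np.dot(pred, pred)   # least-squares gain
            best = min(best, np.sum((rate - gain * pred) ** 2))
    return best

print("residual, Te model:", fit_residual(Te, rate))
print("residual, G model:", fit_residual(G, rate))   # larger residual => Te preferred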
Background: Primates can remember and spatially update the visual direction of previously viewed objects during various types of self-motion. It is known that the brain "remaps" visual memory traces relative to gaze just before and after, but not during, discrete gaze shifts called saccades. However, it is not known how visual memory is updated during slow, continuous motion of the eyes.
A fundamental question in sensorimotor control concerns the transformation of spatial signals from the retina into eye and head motor commands required for accurate gaze shifts. Here, we investigated these transformations by identifying the spatial codes embedded in visually evoked and movement-related responses in the frontal eye fields (FEFs) during head-unrestrained gaze shifts. Monkeys made delayed gaze shifts to the remembered location of briefly presented visual stimuli, with the delay serving to dissociate visual and movement responses.