Amblyopia is a developmental disorder that results from abnormal visual experience in early life and typically reduces visual performance in one eye. We studied the representation of visual motion information in area MT and nearby extrastriate visual areas in two monkeys made amblyopic by an artificial strabismus induced in early life, and in a single age-matched control monkey.
Functional brain development is not well understood. In the visual system, neurophysiological studies in nonhuman primates show relatively mature neuronal properties near birth, even though visual function itself is quite immature and continues to develop over many months or years after birth. Our goal was to assess the relative development of the two main visual processing streams, dorsal and ventral, using BOLD fMRI, in an attempt to understand the global mechanisms that support the maturation of visual behavior.
How the brain maintains an accurate and stable representation of visual target locations despite the occurrence of saccadic gaze shifts is a classical problem in oculomotor research. Here we test and dissociate the predictions of different conceptual models for head-unrestrained gaze-localization behavior of macaque monkeys. We adopted the double-step paradigm with rapid eye-head gaze shifts to measure localization accuracy in response to flashed visual stimuli in darkness.
The auditory system represents sound-source directions initially in head-centered coordinates. To program eye-head gaze shifts to sounds, the orientation of eyes and head should be incorporated to specify the target relative to the eyes. Here we test (1) whether this transformation involves a stage in which sounds are represented in a world- or a head-centered reference frame, and (2) whether acoustic spatial updating occurs at a topographically organized motor level representing gaze shifts, or within the tonotopically organized auditory system.
We studied the influence of static head roll on the perceived auditory zenith in head-centred and world-centred coordinates. Subjects sat either upright, or with their head left/right rolled sideways by about 35° relative to gravity, whilst judging whether a broadband sound was heard left or right from the head-centred or world-centred zenith. When upright, these reference frames coincide.
The double magnetic induction (DMI) method has successfully been used to record head-unrestrained gaze shifts in human subjects (Bremen et al., J Neurosci Methods 160:75-84, 2007a; J Neurophysiol 98:3759-3769, 2007b). This method employs a small golden ring placed on the eye that, when positioned within oscillating magnetic fields, induces orientation-dependent voltages in a pickup coil in front of the eye.
To generate an accurate saccade toward a sound in darkness requires a transformation of the head-centered sound location into an oculocentric motor command, which necessitates the use of an eye-in-head position signal. We tested whether this transformation uses a continuous representation of eye position by exploiting the property that the oculomotor neural integrator is leaky with a time constant of ∼20 s. Hence in complete darkness, the eyes tend to drift toward a neutral position.
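The leaky-integrator property exploited here can be sketched as a first-order exponential decay of eye-in-head position toward the neutral (central) position. This is a minimal illustration of that dynamic, not the authors' analysis; the function name and the choice of τ = 20 s (from the "∼20 s" in the text) are assumptions.

```python
import math

TAU_S = 20.0  # assumed integrator time constant, ~20 s as stated in the text

def eye_position(e0_deg: float, t_s: float, tau_s: float = TAU_S) -> float:
    """Eye-in-head position after t seconds of drift in complete darkness.

    Models the leaky oculomotor integrator as first-order decay toward
    the neutral (zero) position: E(t) = E0 * exp(-t / tau).
    """
    return e0_deg * math.exp(-t_s / tau_s)

# Starting 20 deg eccentric, after one time constant (~20 s) the eye has
# drifted to about 37% of its initial eccentricity (~7.4 deg).
print(round(eye_position(20.0, 20.0), 2))
```

Under this model, a saccade programmed seconds after the sound must compensate for the drift if the transformation uses a continuous eye-position signal.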
Orienting the eyes towards a peripheral sound source calls for a transformation of the head-centred sound coordinates into an oculocentric motor command, which requires an estimate of current eye position. Current models of saccadic control explain spatial accuracy by oculocentric transformations that rely on efference copies of relative eye-displacement signals, rather than on absolute eye position in the orbit. In principle, the gaze-control system could keep track of instantaneous eye position by vector addition of intervening eye-displacement commands.
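The vector-addition scheme mentioned above can be sketched in a few lines: the current eye-position estimate is the starting position plus the running sum of efference-copy displacement commands. This is a hypothetical illustration of the conceptual model, with assumed names and 2-D (horizontal, vertical) positions in degrees.

```python
from typing import Iterable, List

def track_eye_position(start_deg: List[float],
                       displacements_deg: Iterable[List[float]]) -> List[float]:
    """Estimate current eye-in-head position by vector addition:
    start position plus the sum of successive eye-displacement
    (efference copy) commands. Components are (horizontal, vertical) deg.
    """
    h, v = float(start_deg[0]), float(start_deg[1])
    for dh, dv in displacements_deg:
        h += dh
        v += dv
    return [h, v]

# Two intervening saccades: estimate = start + sum of displacements.
print(track_eye_position([0.0, 0.0], [[10.0, 0.0], [-4.0, 3.0]]))
```

Note that this scheme accumulates any error in the individual displacement signals, which is one reason the displacement-based and absolute-position accounts make dissociable predictions.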
Visual stimuli are initially represented in a retinotopic reference frame. To maintain spatial accuracy of gaze (i.e.
Human sound localization relies on implicit head-centered acoustic cues. However, to create a stable and accurate representation of sounds despite intervening head movements, the acoustic input should be continuously combined with feedback signals about changes in head orientation. Alternatively, the auditory target coordinates could be updated in advance by using either the preprogrammed gaze-motor command or the sensory target coordinates to which the intervening gaze shift is made ("predictive remapping").