Publications by authors named "Naotoshi Abekawa"

Hierarchical schemes of brain information processing have frequently assumed that flexible but slow voluntary control modulates a direct sensorimotor process that can quickly generate reactions during dynamic interaction. Here we show that the quick visuomotor process for manual movement is modulated by contexts of postural and visual instability, states that are related to, but remote from and prior to, the manual movement itself. A preceding unstable postural context significantly enhanced the reflexive manual response induced by large-field visual motion during hand reaching, whereas the response was clearly weakened by a preceding random-visual-motion context.

Numerous studies have proposed that our adaptive motor behaviors depend on learning a map between sensory information and limb movement, called an "internal model." From this perspective, how the brain represents internal models is a critical issue in motor learning, especially regarding how they relate to the spatial reference frames processed during motor planning. Extensive experimental evidence suggests that during the planning stages of visually guided hand reaching, the brain transforms visual target representations in gaze-centered coordinates into motor commands in limb coordinates, via hand-target vectors in workspace coordinates.

When reaching for an object with the hand, the gaze is usually directed at the target. In a laboratory setting, fixation is strongly maintained on the reach target until the reach is completed, a phenomenon known as "gaze anchoring." While conventional accounts of such tight eye-hand coordination have often emphasized an internal synergetic linkage between the two motor systems, more recent optimal control theories regard motor coordination as an adaptive solution to task requirements.

A fundamental but controversial question in the information coding of a moving visual target is whether the "motion" or the "position" signal is employed by the brain to produce quick motor reactions. The prevailing theory assumes that visually guided reaching is always driven by a representation of target position that is influenced by various motion signals (e.g.

When the inside texture of a moving object moves, the perceived position of the object is often distorted toward the direction of the texture's motion (motion-induced position shift), and such perceptual distortion accumulates while the object is watched, causing what is known as the curveball illusion. In a recent study, however, this accumulation of position error was not observed in saccadic eye movements. Here, we examined whether the position of the illusory object is represented independently in the perceptual and saccadic systems.

The body midline provides a basic reference for egocentric representation of external space. Clinical observations have suggested that vestibular information underpins egocentric representations. Here we aimed to clarify whether and how vestibular inputs contribute to egocentric representation in healthy volunteers.

Capturing objects by hand requires online motor corrections to compensate for movements of one's own body. Recent studies have shown that background visual motion, usually caused by body movement, plays a significant role in such online corrections. Visual motion applied during a reaching movement induces a rapid and automatic manual following response (MFR) in the direction of the visual motion.

When we perform a visually guided reaching action, the brain coordinates our hand and eye movements. Eye-hand coordination has been examined widely, but it remains unclear whether the hand and eye motor systems are coordinated during on-line visuomotor adjustments induced by a target jump during a reaching movement. As such quick motor responses are required when we interact with dynamic environments, eye and hand movements could be coordinated even during on-line motor control.

Information pertaining to visual motion is used in the brain not only for conscious perception but also for various kinds of motor control. In contrast to the increasing amount of evidence supporting a dissociation of visual processing for action versus perception, it is less clear whether the analysis of visual input is shared across different motor outputs, which require different kinds of interaction with the environment. Here we show that, in human visuomotor control, motion analysis for quick hand control is distinct from that for quick eye control in terms of spatiotemporal analysis and spatial integration.

We investigated a visuomotor mechanism contributing to reach correction: the manual following response (MFR), a quick response to background visual motion that frequently occurs as reafference when the body moves. Although several visual specificities of the MFR have been elucidated, the functional and computational mechanisms of its motor coordination remain unclear, mainly because they involve complex relationships among gaze, the reaching target, and visual stimuli. To directly explore how these factors interact in the MFR, we assessed the impact of spatial coincidences among gaze, arm reaching, and visual motion on the MFR.

In addition to goal-directed preplanned control, which strongly governs reaching movements, another type of control mechanism is suggested by recent findings that arm movements are rapidly entrained by surrounding visual motion. It remains controversial, however, whether this rapid manual response is generated in a goal-oriented manner, as in preplanned control, or is reflexively and directly induced by visual motion. To investigate the sensorimotor process underlying rapid manual responses induced by large-field visual motion, we examined the effects of the contrast and spatiotemporal frequency of the visual-motion stimulus.
