Publications by authors named "Hiroaki Gomi"

Attributing motion to self or external sources based on vestibular cues is suggested to underlie our coherent perception of object motion and self-motion. However, it remains unclear whether such attribution also underlies sensorimotor responses.

When we run our hand across a surface, each finger typically repeats the sensory stimulation that the leading finger has already experienced. Because of this redundancy, the leading finger may attract more attention and contribute more strongly when tactile signals are integrated across fingers to form an overall percept. To test this hypothesis, we re-analyzed data collected in a previous study (Arslanova I, Takamuku S, Gomi H, Haggard P, 128: 418-433, 2022), where two probes were moved in different directions on two different fingerpads and participants reported the probes' average direction.

Primate hands house an array of mechanoreceptors and proprioceptors that provide the tactile and kinematic information crucial for daily motor actions. While regulation of these somatosensory signals is essential for hand movements, where and how the central nervous system (CNS) performs this regulation remains unclear. Our study demonstrates attenuation of somatosensory signals in the cuneate nucleus during voluntary movement, suggesting significant modulation at this initial relay station in the CNS.

Various functional modulations of the stretch reflex help to stabilize actions, but the computational mechanism behind its context-dependent tuning remains unclear. While many studies have demonstrated that motor contexts associated with the task goal functionally modulate the stretch reflex of the upper limbs, it is not well understood how visual contexts independent of the task requirements affect it. To explore this issue, we conducted two experiments testing 20 healthy human participants (age range 20-45, average 31.

Bodily self-awareness relies on a constant integration of visual, tactile, proprioceptive, and motor signals. In the 'rubber hand illusion' (RHI), conflicting visuo-tactile stimuli lead to changes in self-awareness. It remains unclear whether other, somatic signals could compensate for the alterations in self-awareness caused by visual information about the body.

Visual motion analysis is crucial for humans to detect external moving objects and self-motion, both of which inform the planning and execution of actions in various interactions with the environment. Here we show that a convolutional neural network trained to decode self-motion from image motion during natural human movements exhibits stimulus spatiotemporal frequency tuning similar to that of the reflexive ocular and manual responses induced by large-field visual motion. The spatiotemporal frequency tuning of the decoder peaked at high temporal and low spatial frequencies, as observed in the reflexive ocular and manual responses, but differed markedly from both the frequency power of the visual image itself and the density distribution of self-motion.
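
To make the stimulus parameterization concrete, here is a minimal sketch (an illustration, not the study's code) of a 1-D drifting sinusoidal grating, the kind of large-field stimulus whose spatial and temporal frequencies define the tuning axes described above:

```python
import numpy as np

# Minimal sketch (an illustration, not the study's code): a 1-D drifting
# sinusoidal grating is characterized by its spatial frequency fs
# (cycles/deg) and temporal frequency ft (cycles/s); the pattern drifts
# at speed ft / fs (deg/s).
def drifting_grating(fs, ft, x, t):
    """Luminance of the grating at positions x (deg) and time t (s)."""
    return np.sin(2 * np.pi * (fs * x - ft * t))

fs, ft = 0.05, 10.0             # low spatial, high temporal frequency
speed = ft / fs                 # 200 deg/s drift speed
x = np.linspace(0.0, 40.0, 200) # visual field positions (deg)

# The pattern at time t equals the pattern at time 0 shifted by speed * t:
assert np.allclose(drifting_grating(fs, ft, x, 0.01),
                   drifting_grating(fs, ft, x - speed * 0.01, 0.0))
```

The peak at high temporal and low spatial frequencies corresponds to high drift speeds, since drift speed is the ratio ft / fs.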

Hierarchical schemes of brain information processing have often assumed that flexible but slow voluntary action modulates a direct sensorimotor process that can quickly generate reactions in dynamical interaction. Here we show that the quick visuomotor process for manual movement is modulated by postural and visual instability contexts, states that are related to, but remote from and prior to, the manual movement. A preceding unstable postural context significantly enhanced the reflexive manual response induced by large-field visual motion during hand reaching, whereas a preceding random-visual-motion context clearly weakened it.

During the haptic exploration of a planar surface, slight resistances against the hand's movement are illusorily perceived as asperities (bumps) in the surface. If the surface being touched is one's own skin, an actual bump would also produce increased tactile pressure from the moving finger onto the skin. We investigated how kinaesthetic and tactile signals combine to produce haptic perceptions during self-touch.

Sensory prediction error is vital to discriminating whether sensory inputs are caused externally or are consequences of self-action, thereby contributing to a stable perception of the external world and to building a sense of agency. However, it remains unexplored whether prediction error of self-action is also used to estimate internal body condition. To address this point, we examined whether prediction error affects the perceived intensity of muscle fatigue.

Directional tactile pulling sensations are integral to everyday life, but their neural mechanisms remain unknown. Prior accounts hold that primary somatosensory (SI) activity is sufficient to generate pulling sensations, with alternative proposals suggesting that amodal frontal or parietal regions may be critical. We combined high-density EEG with asymmetric vibration, which creates an illusory pulling sensation, thereby unconfounding pulling sensations from unrelated sensorimotor processes.

Interactions with objects involve simultaneous contact with multiple, not necessarily adjacent, skin regions. Although advances have been made in understanding the capacity to selectively attend to a single tactile element among distracting stimulations, here, we examine how multiple stimulus elements are explicitly integrated into an overall tactile percept. Across four experiments, participants averaged the direction of two simultaneous tactile motion trajectories of varying discrepancy delivered to different fingerpads.
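
The averaging judgment in this task can be illustrated with a circular mean of the two trajectory directions; the vector-sum model and function below are illustrative assumptions, not the authors' analysis:

```python
import math

# Illustrative sketch (an assumption, not the authors' analysis): model
# the perceived average of two motion directions as a circular mean,
# i.e. the direction of the sum of the two unit vectors. Unlike an
# arithmetic mean of angles, this handles wrap-around at 0/360 degrees.
def average_direction(theta1_deg, theta2_deg):
    t1, t2 = math.radians(theta1_deg), math.radians(theta2_deg)
    x = math.cos(t1) + math.cos(t2)
    y = math.sin(t1) + math.sin(t2)
    return math.degrees(math.atan2(y, x)) % 360.0

assert abs(average_direction(10, 50) - 30.0) < 1e-6  # symmetric case
# 350 and 10 degrees average to ~0 degrees, not the arithmetic mean of 180:
assert min(average_direction(350, 10), 360.0 - average_direction(350, 10)) < 1e-6
```

The discrepancy between the two trajectories is then simply the (wrap-aware) angular distance between the two input directions.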

Numerous studies have proposed that our adaptive motor behaviors depend on learning a map between sensory information and limb movement, called an "internal model." From this perspective, how the brain represents internal models is a critical issue in motor learning, especially regarding their association with spatial frames processed in motor planning. Extensive experimental evidence suggests that during planning stages for visually guided hand reaching, the brain transforms visual target representations in gaze-centered coordinates to motor commands in limb coordinates, via hand-target vectors in workspace coordinates.

During active movement, there is normally a tight relation between motor command and sensory representation about the resulting spatial displacement of the body. Indeed, some theories of space perception emphasize the topographic layout of sensory receptor surfaces, while others emphasize implicit spatial information provided by the intensity of motor command signals. To identify which has the primary role in spatial perception, we developed experiments based on everyday self-touch, in which the right hand strokes the left arm.

Humans continuously adapt their movement to a novel environment by recalibrating their sensorimotor system. Recent evidence, however, shows that explicit planning to compensate for external changes, i.e.

Can we recover self-motion from vision? This basic issue remains unsolved: while the human visual system is known to estimate the direction of self-motion from optic flow, it remains unclear whether it also estimates the speed. Importantly, the latter requires disentangling self-motion speed from the depths of objects in the scene, as retinal velocity depends on both. Here we show that our automatic, vision-based regulator of walking speed, which estimates the speed and maintains it within its preferred range by adjusting stride length, is robust to changes in scene depth.
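
The speed/depth confound can be shown with a one-line calculation (a small-angle sketch for illustration, not the paper's model):

```python
# Minimal numerical sketch (a small-angle approximation, not the paper's
# model): a point at depth d, passed by an observer translating at speed v,
# sweeps across the retina at roughly v / d radians per second at its
# closest approach. Retinal velocity alone therefore confounds speed
# and depth.
def retinal_velocity(v_mps, depth_m):
    """Angular velocity (rad/s) of a point at depth_m, observer speed v_mps."""
    return v_mps / depth_m

# A slow walk past a near object and a fast walk past a far object are
# indistinguishable from retinal velocity alone (both 0.5 rad/s here):
assert retinal_velocity(1.0, 2.0) == retinal_velocity(2.0, 4.0)
```

Recovering the absolute speed thus requires some additional cue to depth, which is why robustness to depth changes is the key test above.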

Understanding information processing in the brain, and creating general-purpose artificial intelligence, are long-standing aspirations of scientists and engineers worldwide. The distinctive features of human intelligence are high-level cognition and control in various interactions with the world, including the self, which are not defined in advance and vary over time. The challenge of building human-like intelligent machines, together with progress in brain science, behavioural analyses, robotics, and their associated theoretical formalisations, speaks to the importance of world-model learning and inference.

Our brain can be recognized as a network of largely hierarchically organized neural circuits that operate to control specific functions, but when acting in parallel, enable the performance of complex and simultaneous behaviors. Indeed, many of our daily actions require concurrent information processing in sensorimotor, associative, and limbic circuits that are dynamically and hierarchically modulated by sensory information and previous learning. This organization of information processing in biological organisms has served as a major inspiration for artificial intelligence and has helped to create in silico systems capable of matching or even outperforming humans in several specific tasks, including visual recognition and strategy-based games.

When reaching for an object with the hand, the gaze is usually directed at the target. In a laboratory setting, fixation is strongly maintained at the reach target until the reaching is completed, a phenomenon known as "gaze anchoring." While conventional accounts of such tight eye-hand coordination have often emphasized the internal synergetic linkage between both motor systems, more recent optimal control theories regard motor coordination as the adaptive solution to task requirements.

Previous studies (Haswell et al. in Nat Neurosci 12:970-972, 2009; Marko et al. in Brain J Neurol 138:784-797, 2015) reported that people with autism rely less on vision for learning to reach in a force field.

Perception of space has puzzled scientists since antiquity, and is among the foundational questions of scientific psychology. Classical "local sign" theories assert that perception of spatial extent ultimately derives from efferent signals specifying the intensity of motor commands. Everyday cases of self-touch, such as stroking the left forearm with the right index fingertip, provide an important platform for studying spatial perception, because of the tight correlation between motor and tactile extents.

Many perceptual studies focus on the brain's capacity to discriminate between stimuli. However, our normal experience of the world also involves integrating multiple stimuli into a single perceptual event. Neural mechanisms such as lateral inhibition are believed to enhance local differences between sensory inputs from nearby regions of the receptor surface.

Control of the body requires inhibiting complex actions, involving both contracting and relaxing muscles. However, little is known of how voluntary commands to relax a muscle are cancelled. Action inhibition causes both suppression of muscle activity and transient excitation of antagonist muscles, the latter being termed active braking.

Fast signaling from vision and proprioception to muscle activation plays an essential role in quickly correcting movement. Though many studies have demonstrated that quick sensorimotor responses are modulated depending on context within each modality, the contribution of multimodal information has not been established. Here, we examined whether the state estimates contributing to stretch reflexes are represented solely by proprioceptive information or by multimodal information.

Estimating forces acting between our hand and objects is essential for dexterous motor control. An earlier study suggested that vision contributes to the estimation by demonstrating changes in grip force pattern caused by delayed visual feedback. However, two possible vision-based force estimation processes, one based on hand position and another based on object motion, were both able to explain the effect.

A fundamental but controversial question in the information coding of moving visual targets is whether 'motion' or 'position' signals are employed by the brain to produce quick motor reactions. The prevailing theory assumes that visually guided reaching is always driven via a target position representation influenced by various motion signals (e.g.
