Self-motion perception and the vestibulo-ocular reflex (VOR) were studied during whole-body yaw rotation in the dark at different static head positions. Rotations consisted of four cycles of either symmetric sinusoidal or asymmetric oscillation. Self-motion perception was evaluated by measuring the subjects' ability to manually track a remembered stationary target. The VOR was recorded separately and the slow-phase eye position (SPEP) was computed. Three static head yaw deviations relative to the trunk (0°, 45° to the right, and 45° to the left), attained either actively or passively, were examined. Active head deviations had a significant effect during asymmetric oscillation: motion perception was enhanced when the head was kept turned toward the side of body rotation and reduced when it was turned in the opposite direction. Conversely, passive head deviations had no effect on motion perception. Furthermore, vibration (100 Hz) of the neck muscles splenius capitis and sternocleidomastoideus markedly influenced perceived rotation during asymmetric oscillation. By contrast, the SPEP of the VOR was modulated by active head deviation but was not influenced by neck muscle vibration. Through its effects on motion perception and reflex gain, head position improved gaze stability and enhanced self-motion perception in the direction of the head deviation.
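To make the stimulus design concrete, the following is a minimal numerical sketch (Python with NumPy) of symmetric versus asymmetric yaw-velocity profiles of the kind described above, together with a toy slow-phase eye position trace derived from a constant VOR gain. The sampling rate, cycle durations, per-half-cycle excursion, and gain value are illustrative assumptions, not parameters reported in the study.

```python
# Illustrative sketch only (not the authors' code): builds symmetric vs.
# asymmetric yaw-velocity profiles and derives a toy slow-phase eye position
# (SPEP) trace from a constant VOR gain. All numeric values below are
# assumptions chosen for illustration.
import numpy as np

FS = 100.0          # sampling rate in Hz (assumed)
N_CYCLES = 4        # four oscillation cycles, as in the protocol
DISPLACEMENT = 90.0 # chair excursion per half-cycle in degrees (assumed)
VOR_GAIN = 0.7      # constant toy VOR gain (assumed)


def half_sine_velocity(duration_s, displacement_deg, fs=FS):
    """Half-sine velocity bump whose time integral equals displacement_deg
    (peak velocity = pi * displacement / (2 * duration))."""
    t = np.arange(0.0, duration_s, 1.0 / fs)
    peak = np.pi * displacement_deg / (2.0 * duration_s)
    return peak * np.sin(np.pi * t / duration_s)


def oscillation_velocity(t_right_s, t_left_s):
    """One cycle: a rightward half-cycle followed by a leftward half-cycle of
    equal displacement; unequal durations make the stimulus asymmetric."""
    right = half_sine_velocity(t_right_s, DISPLACEMENT)
    left = -half_sine_velocity(t_left_s, DISPLACEMENT)
    return np.concatenate([right, left])


# Symmetric oscillation: both half-cycles last 2 s (a pure sinusoid).
sym_vel = np.tile(oscillation_velocity(2.0, 2.0), N_CYCLES)

# Asymmetric oscillation: fast 1-s half-cycle one way, slow 3-s return.
asym_vel = np.tile(oscillation_velocity(1.0, 3.0), N_CYCLES)

# Toy SPEP: compensatory (sign-inverted) eye velocity scaled by a fixed gain
# and integrated over time; real SPEP is computed from desaccaded recordings.
spep_asym = np.cumsum(-VOR_GAIN * asym_vel) / FS

print(f"net chair displacement (asym): {np.sum(asym_vel) / FS:.1f} deg")
print(f"final toy SPEP (asym): {spep_asym[-1]:.1f} deg")
```

Because each half-cycle covers the same angular displacement, the symmetric and asymmetric profiles differ only in their velocity time course, which is what allows asymmetric oscillation to probe direction-dependent effects of head position on perceived rotation.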


Source
http://dx.doi.org/10.1016/j.humov.2010.10.005

Publication Analysis

Top Keywords

self-motion perception (16)
head (9)
perception vestibulo-ocular (8)
vestibulo-ocular reflex (8)
body yaw (8)
yaw rotation (8)
head position (8)
active head (8)
head deviations (8)
asymmetric oscillation (8)

Similar Publications

During locomotion, the visual system can factor out the motion component caused by observer locomotion from the complex target flow vector to obtain the world-relative target motion. This process, which has been termed flow parsing, is known to be incomplete, but viewing with both eyes could potentially aid in this task. Binocular disparity and binocular summation could both improve performance when viewing with both eyes.


Motor information contributes to visuotactile interaction in trunk-centered peripersonal space during a pedaling situation.

Exp Brain Res, December 2024. Faculty of Humanities and Social Sciences (Psychology), Kumamoto University, 2-40-1 Kurokami, Kumamoto, 860-8555, Japan.

Peripersonal space (PPS), the space immediately surrounding one's body, contributes to interactions with the external environment. Previous studies have demonstrated that PPS expands during whole-body self-motion. Furthermore, motor and proprioceptive information contributes to this phenomenon.


The current study sought to examine factors that affect vection (the illusory experience of self-motion in the absence of real motion), visually-induced motion sickness, and one's sense of presence in a passive virtual reality driving simulation by exposing participants to 60-s pre-recorded driving laps and recording their self-reported metrics as well as their head motion patterns during the laps. Faster virtual driving speed (average 120 mph vs. 60 mph) resulted in significantly higher ratings of vection and motion sickness.


Aging-associated decline in peripheral vestibular function is linked to deficits in executive ability, self-motion perception, and motor planning and execution. While these behaviors are known to rely on the sensorimotor and frontal cortices, the precise pathways involving the frontal and sensorimotor cortices in these vestibular-associated behaviors are unknown. To fill this knowledge gap, this cross-sectional study investigates the relationship between age-related variation in vestibular function and surface shape alterations of the frontal and sensorimotor cortices, considering age, intracranial volume, and sex.


Sensory neurons often encode multisensory or multimodal signals. For example, many medial superior temporal (MST) neurons are tuned to heading direction of self-motion based on visual (optic flow) signals and vestibular signals. Middle temporal (MT) cortical neurons are tuned to object depth from signals of two visual modalities: motion parallax and binocular disparity.

