We investigated the nature of the control mechanisms at work during goal-oriented locomotion. In particular, we tested the effects of vision, locomotor speed, and the presence of via points on the geometric and kinematic properties of locomotor trajectories. We first observed that the average trajectories recorded in visual and nonvisual locomotion were highly comparable, suggesting the existence of vision-independent processes underlying the formation of locomotor trajectories. Then by analyzing and comparing the variability around the average trajectories across different experimental conditions, we were able to demonstrate the existence of on-line feedback control in both visual and nonvisual locomotion and to clarify the relations between visual and nonvisual control strategies. Based on these insights, we designed a model in which maximum-smoothness and optimal feedback control principles account, respectively, for the open-loop and feedback processes. Taken together, the experimental and modeling findings provide a novel understanding of the nature of the motor, sensory, and "navigational" processes underlying goal-oriented locomotion.
DOI: http://dx.doi.org/10.1152/jn.00284.2009
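As a rough illustration of the two-stage model described in the abstract, the Python sketch below pairs a standard minimum-jerk (maximum-smoothness) planner with a simple proportional correction loop. The fifth-order minimum-jerk polynomial is a textbook result, but the via-point coordinates, noise level, and gain are illustrative assumptions, and the proportional loop is only a crude stand-in for the paper's optimal feedback controller, not the authors' implementation.

```python
import numpy as np

def minimum_jerk_path(x0, x1, n_steps=100):
    """Maximally smooth point-to-point path: the 5th-order polynomial
    that minimizes integrated squared jerk between x0 and x1."""
    x0, x1 = np.asarray(x0, float), np.asarray(x1, float)
    tau = np.linspace(0.0, 1.0, n_steps)            # normalized time
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5      # minimum-jerk scaling
    return x0 + s[:, None] * (x1 - x0)

# Open-loop plan through an intermediate point, mimicking the
# via-point conditions of the experiments (coordinates are made up).
plan = np.vstack([
    minimum_jerk_path([0.0, 0.0], [2.0, 1.5]),      # start -> via point
    minimum_jerk_path([2.0, 1.5], [5.0, 0.0]),      # via point -> goal
])

# Crude feedback stage: at each step, execution noise perturbs the
# walker and a proportional correction pulls it back toward the plan.
rng = np.random.default_rng(0)
gain, pos, executed = 0.3, plan[0].copy(), []
for target in plan:
    pos += (target - pos) * gain + rng.normal(0.0, 0.02, size=2)
    executed.append(pos.copy())
executed = np.array(executed)
```

A fuller implementation would replace the proportional loop with an LQG-style optimal feedback controller, which corrects only task-relevant deviations rather than pulling the walker back to the planned path at every step.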
Energy Build
February 2025
Department of Architectural Engineering, Penn State University, University Park, PA 16803, USA.
Growing research on the non-visual effects of light underscores the role of architectural glazing systems in managing transmitted shortwave solar radiation and shaping indoor circadian light, which is vital for well-being. This two-phase study evaluates how well existing window properties predict a glazing system's contribution to circadian lighting. In the first phase, a decision tree analysis of these properties revealed that, although traditional glazing metrics do not estimate circadian performance with full accuracy, they can still serve, when supplemented with specific thresholds, as rapid tools for selecting windows optimized for circadian health.
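To make that first phase concrete, here is a minimal, hypothetical sketch of the kind of threshold-finding decision tree analysis described, using scikit-learn. The feature names (visible transmittance and solar heat gain coefficient) are common glazing metrics, but all values, labels, and the resulting thresholds are fabricated for illustration and do not come from the study.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical glazing samples: [visible transmittance, SHGC] and a
# binary label for whether the window met a circadian lighting
# criterion -- all values are illustrative, not the study's data.
X = np.array([
    [0.70, 0.55], [0.65, 0.50], [0.40, 0.30], [0.35, 0.25],
    [0.60, 0.45], [0.30, 0.20], [0.75, 0.60], [0.45, 0.35],
])
y = np.array([1, 1, 0, 0, 1, 0, 1, 0])

# A shallow tree recovers simple thresholds on traditional metrics
# that can serve as a rapid screening rule for window selection.
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["visible_transmittance", "SHGC"]))
```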
iScience
December 2024
Department of Psychology, The Ohio State University, Columbus, OH 43210, USA.
The visual word form area (VWFA) is a region in the left ventrotemporal cortex (VTC) whose specificity remains contentious. Using precision fMRI, we examine the VWFA's responses to numerous visual and nonvisual stimuli, comparing them to those of adjacent category-selective visual regions and of regions involved in language and attentional demand. We find that the VWFA responds moderately to non-word visual stimuli but is unique within the VTC in its pronounced selectivity for visual words.
Soybean (Glycine max [L.] Merr.) production is susceptible to biotic and abiotic stresses, exacerbated by extreme weather events.
Sci Rep
December 2024
Department of Biology, University of Oxford, 11a Mansfield Road, Oxford, OX1 3SZ, UK.
Mate availability and social information can influence mating behaviour in both males and females. Social information obtained from conspecifics can influence mate choice, as shown in particular by studies of mate-choice copying. However, the effect of directly observing conspecific mating on subsequent mating behaviour has been less explored.
Tomography
December 2024
Department of Electrical and Computer Engineering, University of Wisconsin-Madison, Madison, WI 53706, USA.
This research introduces BAE-ViT, a specialized vision transformer model developed for bone age estimation (BAE). This model is designed to efficiently merge image and sex data, a capability not present in traditional convolutional neural networks (CNNs). BAE-ViT employs a novel data fusion method to facilitate detailed interactions between visual and non-visual data by tokenizing non-visual information and concatenating all tokens (visual or non-visual) as the input to the model.
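The token-fusion idea can be sketched in a few lines of PyTorch: embed the non-visual variable (sex) as one extra token and concatenate it with the image patch tokens before the encoder, so self-attention can model visual/non-visual interactions directly. All layer names, sizes, and the toy patch embedding below are assumptions for illustration, not the published BAE-ViT architecture.

```python
import torch
import torch.nn as nn

class TokenFusionViT(nn.Module):
    """Sketch of fusing a non-visual variable into a ViT token stream.

    The sex variable is embedded as one extra token and concatenated
    with the image patch tokens before the transformer encoder, so
    self-attention can relate visual and non-visual information.
    """
    def __init__(self, dim=256, n_patches=196, depth=4, heads=8):
        super().__init__()
        self.patch_proj = nn.Linear(16 * 16, dim)     # toy patch embedding
        self.sex_embed = nn.Embedding(2, dim)         # non-visual token
        self.pos = nn.Parameter(torch.zeros(1, n_patches + 1, dim))
        layer = nn.TransformerEncoderLayer(dim, heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, depth)
        self.head = nn.Linear(dim, 1)                 # bone-age regression

    def forward(self, patches, sex):
        # patches: (B, n_patches, 256); sex: (B,) with values in {0, 1}
        tokens = self.patch_proj(patches)             # visual tokens
        sex_tok = self.sex_embed(sex).unsqueeze(1)    # (B, 1, dim)
        tokens = torch.cat([sex_tok, tokens], dim=1) + self.pos
        return self.head(self.encoder(tokens).mean(dim=1))

model = TokenFusionViT()
out = model(torch.randn(2, 196, 256), torch.tensor([0, 1]))  # shape (2, 1)
```

Concatenating the non-visual token before the encoder, rather than appending the scalar to a pooled feature vector as a CNN typically would, lets every attention layer condition the visual representation on the non-visual input.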