Ventral motion parallax enhances fruit fly steering to visual sideslip.

Biol Lett

Department of Biological Sciences, Florida International University, Miami, FL 33199, USA.

Published: May 2020

Flies and other insects use incoherent motion (parallax) in the frontal and lateral visual fields to measure distances and identify obstacles during translation. Although additional depth information could be drawn from below, there is no experimental evidence that they use it. The finding that blowflies encode motion disparities in their ventral visual fields suggests this region may be important for depth perception. We used a virtual flight arena to measure fruit fly steering responses to optic flow. The stimuli appeared below (n = 51) or above the fly (n = 44), at different speeds, with or without parallax cues. Dorsal parallax does not affect responses, and similar motion disparities in rotation have no effect anywhere in the visual field. However, responses to strong ventral sideslip (206° s⁻¹) change drastically depending on the presence or absence of parallax. Ventral parallax could help resolve ambiguities in cluttered motion fields and enhance corrective responses to nearby objects.
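The distinction the abstract draws between translation and rotation follows from the geometry of optic flow: during translation, the angular speed of a point's image scales inversely with its distance, whereas during rotation every point sweeps past at the rotation rate regardless of depth. The sketch below is not from the paper; it is a minimal illustration of that geometry, with the sideslip speed and point distances chosen arbitrarily for the example.

```python
import numpy as np

def translational_image_speed(v, r, alpha_deg):
    """Angular image speed (deg/s) of a stationary point seen by a
    translating observer: only the velocity component perpendicular to
    the line of sight shifts the image, and the effect scales with 1/r,
    so near points stream faster than far ones (motion parallax)."""
    alpha = np.radians(alpha_deg)  # angle between direction of translation and line of sight
    return np.degrees(v * np.sin(alpha) / r)

def rotational_image_speed(omega_deg_s):
    """During pure rotation every point moves across the eye at the
    rotation rate itself, independent of its distance: no parallax."""
    return omega_deg_s

# Hypothetical numbers for illustration only (not from the study):
v = 0.3                   # sideslip speed, m/s
near, far = 0.05, 0.50    # distances to two ground points, m

# Both points lie directly abeam of the direction of sideslip (alpha = 90 deg).
print(translational_image_speed(v, near, 90.0))   # ~344 deg/s for the near point
print(translational_image_speed(v, far, 90.0))    # ~34 deg/s for the far point

# During a 206 deg/s rotation the same two points move identically:
print(rotational_image_speed(206.0), rotational_image_speed(206.0))
```

Because rotation produces no depth-dependent disparity, parallax cues are informative only for translational flow components such as sideslip, which is consistent with the report that rotational motion disparities had no effect anywhere in the visual field.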


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7280038
DOI: http://dx.doi.org/10.1098/rsbl.2020.0046


Similar Publications

Holographic displays have the potential to reconstruct natural light field information, making them highly promising for applications in augmented reality (AR), head-up displays (HUDs), and new types of transparent three-dimensional (3D) displays. However, current spatial light modulators (SLMs) are constrained by pixel size and resolution, limiting display size. Additionally, existing holographic displays have narrow viewing angles due to device diffraction limits, algorithms, and optical configurations.


High-quality light-field generation of real scenes based on view synthesis remains a significant challenge in three-dimensional (3D) light-field displays. Recent advances in neural radiance fields have greatly enhanced light-field generation. However, challenges persist in synthesizing high-quality cylindrical viewpoints within a short time.


Objects project different images when viewed from varying locations, but the visual system can correct perspective distortions and identify objects across viewpoints. This study investigated the conditions under which the visual system allocates computational resources to construct view-invariant, extraretinal representations, focusing on planar symmetry. When a symmetrical pattern lies on a plane, its symmetry in the retinal image is degraded by perspective.


Relating visual and pictorial space: Integration of binocular disparity and motion parallax.

J Vis

December 2024

BioMotionLab, Centre for Vision Research and Department of Biology, York University, Toronto, Ontario, Canada.

Traditionally, perceptual spaces are defined by the medium through which the visual environment is conveyed (e.g., in a physical environment, through a picture, or on a screen).


Sensory neurons often encode multisensory or multimodal signals. For example, many medial superior temporal (MST) neurons are tuned to heading direction of self-motion based on visual (optic flow) signals and vestibular signals. Middle temporal (MT) cortical neurons are tuned to object depth from signals of two visual modalities: motion parallax and binocular disparity.

