Introduction: A full understanding of how we see our world remains a fundamental research question in vision neuroscience. While topographic profiling has allowed us to identify different visual areas, the exact functional characteristics and organization of areas higher in the visual hierarchy (beyond V1 and V2) are still debated. It is hypothesized that visual area V4 represents a vital intermediate stage in the processing of spatial and curvature information preceding object recognition. Advancements in magnetic resonance imaging hardware and acquisition techniques (e.g., non-BOLD functional MRI) now permit the capture of cortical layer-specific functional properties and organization of the human brain (including the visual system) at high precision.
Methods: Here, we use functional cerebral blood volume measures to study the modular organization of responses to contours (curvature) within area V4 of the human brain. To achieve this at 3 Tesla (a clinically relevant field strength), we use optimized high-resolution 3D Echo Planar Imaging (EPI) Vascular Space Occupancy (VASO) measurements.
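For readers unfamiliar with VASO, the CBV-weighted signal in interleaved (SS-SI style) VASO acquisitions is typically obtained by dynamically dividing the blood-nulled volumes by the adjacent BOLD-weighted (not-nulled) volumes to remove BOLD contamination. The sketch below illustrates that standard correction in Python; the file name, interleaving order, and use of nibabel are illustrative assumptions, not the acquisition or analysis parameters of this study.

```python
import numpy as np
import nibabel as nib

# Illustrative input: a 4D series in which blood-nulled and BOLD-weighted
# (not-nulled) volumes alternate. File name and interleaving are assumptions.
img = nib.load("vaso_interleaved.nii.gz")
data = img.get_fdata()                       # shape: (x, y, z, t)

nulled    = data[..., 0::2]                  # assumed: even volumes are blood-nulled
not_nulled = data[..., 1::2]                 # assumed: odd volumes are BOLD-weighted
n_pairs   = min(nulled.shape[-1], not_nulled.shape[-1])

# Dynamic division removes the BOLD weighting from the nulled images,
# leaving a CBV-weighted VASO time series.
eps = 1e-6
vaso = nulled[..., :n_pairs] / (not_nulled[..., :n_pairs] + eps)

nib.save(nib.Nifti1Image(vaso.astype(np.float32), img.affine),
         "vaso_bold_corrected.nii.gz")
```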
Results: Our data provide the first evidence of curvature domains in human V4, consistent with previous findings in non-human primates. We show that VASO and BOLD tSNR maps align with their high-field equivalents, with robust time-series responses to visual stimuli measured across the visual cortex. V4 curvature preference maps derived from VASO show strong modular organization compared to the BOLD imaging contrast; BOLD showed much lower sensitivity and specificity to the stimulus contrast, owing to its known weighting toward venous vasculature. We show evidence that curvature domains persist across the cortical depth. This work advances our understanding of the role of mid-level area V4 in human processing of curvature and shape features.
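As background, the tSNR maps referred to above are voxel-wise ratios of the temporal mean to the temporal standard deviation, and a curvature preference map can be illustrated as a winner-take-all assignment across condition responses. The sketch below shows both; the input shapes and the winner-take-all choice are illustrative assumptions, not necessarily the mapping procedure used in this study.

```python
import numpy as np

def tsnr(timeseries):
    """Temporal SNR: voxel-wise mean divided by standard deviation over time.

    timeseries: (x, y, z, t) array of motion-corrected functional data.
    """
    mean = timeseries.mean(axis=-1)
    std = timeseries.std(axis=-1)
    # Avoid division by zero in voxels with a flat time series.
    return np.divide(mean, std, out=np.zeros_like(mean), where=std > 0)

def preference_map(condition_responses):
    """Winner-take-all curvature preference per voxel.

    condition_responses: (n_conditions, x, y, z) array of, e.g., GLM beta
    estimates for each curvature condition (an illustrative input format).
    Returns the index of the condition with the largest response.
    """
    return np.argmax(condition_responses, axis=0)
```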
Impact: Knowledge of how the functional architecture and hierarchical integration of local contours (curvature) contribute to the formation of shapes can inform computational models of object recognition. The techniques described here allow quantification of individual differences in the functional architecture of mid-level visual areas, helping drive a better understanding of how changes in functional brain organization relate to differences in visual perception.
Full text: PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11897262
DOI: http://dx.doi.org/10.3389/fnins.2025.1537026
Front Neurosci
February 2025
York Neuroimaging Centre, University of York, York, United Kingdom.
Am J Ophthalmol
March 2025
Department of ophthalmology, Kangwon National University School of Medicine, Kangwon National University Hospital, Chuncheon, Republic of Korea. Electronic address:
Objective: To compare lamina cribrosa (LC) parameters in non-glaucomatous eyes with pseudoexfoliation syndrome (PXFS) and healthy control eyes to assess structural alterations that may contribute to early glaucoma pathophysiology.
Design: Retrospective, cross-sectional study.
Participants: Fifty eyes with non-glaucomatous PXFS and 50 healthy, age-matched control eyes.
IEEE J Biomed Health Inform
February 2025
Domain shifts between samples acquired with different instruments are one of the major challenges in accurate segmentation of Optical Coherence Tomography (OCT) images. Given that OCT images may be acquired with different devices in different clinical centers, this study presents a style and structure data augmentation (SSDA) method to improve the adaptability of segmentation models. Inspired by our initial analysis of OCT domain differences, we propose the hypothesis that domain shifts are primarily due to differences in image style and anatomical structure, which further guides the design of our method.
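As one concrete illustration of style-oriented augmentation (a generic technique, not the SSDA method itself, whose details are not reproduced here), mixing low-frequency Fourier amplitudes between images is a common way to perturb image "style" while leaving anatomical structure largely intact. A minimal sketch, assuming 2D grayscale OCT B-scans of equal size:

```python
import numpy as np

def fourier_style_mix(source, reference, beta=0.05, alpha=0.5):
    """Blend the low-frequency Fourier amplitude of `reference` into `source`.

    Illustrative style perturbation only (cf. Fourier-domain adaptation);
    not the SSDA method. source, reference: 2D arrays of identical shape.
    """
    fft_src = np.fft.fftshift(np.fft.fft2(source))
    fft_ref = np.fft.fftshift(np.fft.fft2(reference))
    amp_src, phase_src = np.abs(fft_src), np.angle(fft_src)
    amp_ref = np.abs(fft_ref)

    h, w = source.shape
    b = max(1, int(min(h, w) * beta))            # half-size of the low-frequency patch
    cy, cx = h // 2, w // 2
    sl = (slice(cy - b, cy + b), slice(cx - b, cx + b))
    # Mix only the central (low-frequency) amplitudes, which carry "style".
    amp_src[sl] = (1 - alpha) * amp_src[sl] + alpha * amp_ref[sl]

    mixed = amp_src * np.exp(1j * phase_src)
    return np.real(np.fft.ifft2(np.fft.ifftshift(mixed)))
```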
IEEE Trans Med Imaging
December 2024
Handheld ultrasound devices face usage limitations due to user inexperience and cannot benefit from supervised deep learning without extensive expert annotations. Moreover, models trained on standard ultrasound device data are constrained by the training data distribution and perform poorly when applied directly to handheld device data. In this study, we propose Training-free Image Style Alignment (TISA) to align the style of handheld device data to that of standard devices.
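As a simple illustration of training-free style alignment (a generic stand-in, not a description of TISA), histogram matching can push a handheld-device image's intensity distribution toward that of a standard-device reference before the image is passed to a model trained on standard-device data:

```python
import numpy as np
from skimage.exposure import match_histograms

def align_style(handheld_img, standard_img):
    """Map the intensity distribution of a handheld-device image onto that of a
    standard-device reference image. Illustrative only; TISA's actual
    alignment procedure is not reproduced here."""
    return match_histograms(handheld_img, standard_img)

# Hypothetical usage: the aligned image would then be fed to a segmentation
# model trained on standard-device data without any retraining.
# aligned = align_style(handheld_scan, standard_reference)
# mask = segmentation_model(aligned)
```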
J Colloid Interface Sci
February 2025
Department of Chemistry, KTH Royal Institute of Technology, School of Engineering Sciences in Chemistry, Biotechnology and Health, Teknikringen 30, 100 44 Stockholm, Sweden; Materials and Surface Design, RISE Research Institutes of Sweden, Box 5607, SE-114 86 Stockholm, Sweden; University of New South Wales, Sydney 2052, Australia; Laboratoire de Tribologie et Dynamique des Systèmes, École Centrale de Lyon, Lyon 69130, France. Electronic address:
Long, straight-chain saturated fatty acids form homogeneous, featureless monolayers on a supramolecular length scale at the water-air interface. In contrast, a naturally occurring saturated branched fatty acid, 18-methyl eicosanoic acid (18-MEA), has been observed to form three-dimensional domains 20-80 nm in size, using a combination of Langmuir trough measurements, Atomic Force Microscopy (AFM) imaging of the deposited monolayers, Neutron Reflectometry (NR), and X-Ray Reflectometry (XRR). It is hypothesized that these domains result from curvature of the water surface induced by the steric constraints of the methyl branch.