A central topic in sentence comprehension research concerns the kinds of information and the mechanisms involved in resolving temporary ambiguity about the syntactic structure of a sentence. Gaze patterns in scenes during spoken sentence comprehension have provided strong evidence that visual scenes trigger rapid syntactic reanalysis. However, such gaze patterns have also been interpreted as reflecting nonlinguistic, visual processes. Furthermore, little is known about whether linguistic and scene cues trigger similar processes of syntactic revision. To better understand how scenes influence comprehension and its time course, we recorded event-related potentials (ERPs) during the comprehension of spoken sentences that relate to depicted events. Prior electrophysiological research has observed a P600 when structural disambiguation toward a noncanonical structure occurred during reading, in the absence of scenes. We observed an ERP component with a similar latency, polarity, and distribution when depicted events disambiguated toward a noncanonical structure. These distributional similarities further suggest that scenes are on a par with linguistic contexts in triggering syntactic revision. Our findings confirm the interpretation of previous eye movement studies and highlight the benefits of combining ERP and eye-tracking measures to ascertain the neuronal processes enabled by, and the locus of attention in, visual contexts.
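To make the ERP measure concrete, the sketch below shows how condition-average ERPs are typically formed by averaging epochs time-locked to the disambiguating point and then compared in a late positive window. This is a minimal, hypothetical illustration in plain NumPy; the sampling rate, epoch lengths, and the 500-900 ms window are assumptions for the example, not details of this study's pipeline.

```python
import numpy as np

def compute_erp(epochs, baseline_samples):
    """Average time-locked epochs (trials x channels x samples) after
    subtracting each trial's mean over the pre-stimulus baseline."""
    baseline = epochs[:, :, :baseline_samples].mean(axis=2, keepdims=True)
    return (epochs - baseline).mean(axis=0)  # channels x samples

def mean_in_window(erp, sfreq, t0, tmin, tmax):
    """Mean amplitude per channel in a latency window (seconds after onset t0)."""
    lo = int((t0 + tmin) * sfreq)
    hi = int((t0 + tmax) * sfreq)
    return erp[:, lo:hi].mean(axis=1)

# Hypothetical data: 40 trials, 32 channels, 1.2 s epochs at 250 Hz,
# with 200 ms of pre-stimulus baseline.
sfreq, baseline_s = 250, 0.2
rng = np.random.default_rng(0)
canonical = rng.normal(size=(40, 32, 300))
noncanonical = rng.normal(size=(40, 32, 300))

erp_can = compute_erp(canonical, int(baseline_s * sfreq))
erp_non = compute_erp(noncanonical, int(baseline_s * sfreq))

# A P600-like effect: more positive mean amplitude for the noncanonical
# disambiguation in a late window (0.5-0.9 s after onset, assumed here).
p600_effect = (mean_in_window(erp_non, sfreq, baseline_s, 0.5, 0.9)
               - mean_in_window(erp_can, sfreq, baseline_s, 0.5, 0.9))
print(p600_effect.shape)  # (32,): per-channel effect for a scalp topography
```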

Source: http://dx.doi.org/10.1093/cercor/bhm121

Publication Analysis

Top Keywords

visual scenes (8); scenes trigger (8); syntactic reanalysis (8); sentence comprehension (8); syntactic revision (8); depicted events (8); noncanonical structure (8); syntactic (5); comprehension (5); scenes (5)

Similar Publications

Objective: What we hear may influence postural control, particularly in people with vestibular hypofunction. Would hearing a moving subway destabilize people similarly to seeing the train move? We investigated how people with unilateral vestibular hypofunction and healthy controls incorporated broadband and real-recorded sounds with visual load for balance in an immersive contextual scene.

Design: Participants stood on foam placed on a force-platform, wore the HTC Vive headset, and observed an immersive subway environment.
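As an aside on the balance measure this design implies, the following is a minimal, hypothetical sketch of deriving center-of-pressure (COP) sway from force-platform forces and moments and summarizing it as path length. The sampling rate, signal layout, and the choice of path length as the summary statistic are assumptions for illustration, not this study's analysis.

```python
import numpy as np

def center_of_pressure(fx, fy, fz, mx, my, z0=0.0):
    """Center of pressure (m) from force-platform forces (N) and moments (N*m);
    z0 is the height of the plate surface above the sensor origin."""
    cop_x = (-my - fx * z0) / fz
    cop_y = ( mx - fy * z0) / fz
    return cop_x, cop_y

def sway_path_length(cop_x, cop_y):
    """Total length of the COP trajectory, a common postural-sway summary."""
    return np.sum(np.hypot(np.diff(cop_x), np.diff(cop_y)))

# Hypothetical 30 s quiet-stance trial sampled at 100 Hz.
t = np.arange(0, 30, 0.01)
rng = np.random.default_rng(1)
fz = 700 + rng.normal(0, 5, t.size)                     # vertical load (N)
fx = rng.normal(0, 2, t.size)                           # shear forces (N)
fy = rng.normal(0, 2, t.size)
mx = 10 * np.sin(0.5 * t) + rng.normal(0, 1, t.size)    # moments (N*m)
my = 8 * np.cos(0.3 * t) + rng.normal(0, 1, t.size)

cop_x, cop_y = center_of_pressure(fx, fy, fz, mx, my)
print(f"sway path length: {sway_path_length(cop_x, cop_y):.3f} m")
```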

Epitome of the Region-Regional Nostalgia Design Based on Digital Twins.

Behav Sci (Basel), December 2024. Smart Design Lab, School of Design, Northumbria University, Newcastle upon Tyne NE1 8ST, UK.

Nostalgic scenes can trigger nostalgia to a considerable extent and can be used effectively as triggers that contribute to the psychological comfort of elderly and immigrant populations, but a corresponding design system has not been adequately studied. Therefore, this study proposes design principles and a digital twin (DT) design system for nostalgic scenes, focusing on the construction of a nostalgic-scene DT model based on system-of-systems (SoS) theory.

Arbitrary-translation six-degrees-of-freedom (6DoF) video represents a transitional stage toward immersive terminal video, allowing users to freely switch viewpoints for a 3D scene experience. However, the increased freedom of movement introduces new distortions that significantly affect perceived visual quality. It is therefore crucial to explore quality assessment (QA) to validate the feasibility of such applications.

Significance: Decoding naturalistic content from brain activity has important neuroscience and clinical implications. Information about visual scenes and intelligible speech has been decoded from cortical activity using functional magnetic resonance imaging (fMRI) and electrocorticography, but widespread applications are limited by the logistics of these technologies.

Aim: High-density diffuse optical tomography (HD-DOT) offers image quality approaching that of fMRI but with the silent, open scanning environment afforded by optical methods, thus opening the door to more naturalistic research and applications.

Historically, electrophysiological correlates of scene processing have been studied in experiments using static stimuli presented for discrete intervals while participants maintain a fixed eye position. Gaps remain in generalizing these findings to real-world conditions, where eye movements are made to select new visual information and where the environment remains stable but changes with our position and orientation in space, driving dynamic visual stimulation. Co-recording eye movements and electroencephalography (EEG) is an approach that leverages fixations as time-locking events in the EEG recording under free-viewing conditions to create fixation-related potentials (FRPs), providing a neural snapshot in which to study visual processing under naturalistic conditions.
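To illustrate the FRP construction described here, the sketch below uses fixation-onset times from an eye tracker as time-locking events to cut, baseline-correct, and average EEG epochs. It is a minimal, hypothetical example in plain NumPy; the sampling rate, epoch window, and synthetic data are assumptions, not the recording parameters of any particular study.

```python
import numpy as np

def fixation_related_potential(eeg, fixation_onsets_s, sfreq,
                               tmin=-0.2, tmax=0.5):
    """Average EEG epochs time-locked to fixation onsets.

    eeg: channels x samples continuous recording
    fixation_onsets_s: fixation-onset times in seconds (from the eye tracker)
    """
    pre = int(-tmin * sfreq)
    post = int(tmax * sfreq)
    epochs = []
    for onset in fixation_onsets_s:
        center = int(onset * sfreq)
        if center - pre < 0 or center + post > eeg.shape[1]:
            continue  # skip fixations too close to the recording edges
        epoch = eeg[:, center - pre:center + post]
        # Baseline-correct on the pre-fixation interval.
        epochs.append(epoch - epoch[:, :pre].mean(axis=1, keepdims=True))
    return np.mean(epochs, axis=0)  # channels x samples FRP

# Hypothetical data: 64-channel EEG at 500 Hz, 60 s of free viewing,
# with fixation onsets detected from co-recorded eye-tracking data.
sfreq = 500
rng = np.random.default_rng(2)
eeg = rng.normal(size=(64, 60 * sfreq))
fixation_onsets = np.cumsum(rng.uniform(0.2, 0.4, size=150))  # seconds

frp = fixation_related_potential(eeg, fixation_onsets, sfreq)
print(frp.shape)  # (64, 350): -200 ms to +500 ms around fixation onset
```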
