Children's gaze behavior reflects emergent linguistic knowledge and real-time language processing of speech, but little is known about naturalistic gaze behaviors while watching signed narratives. Measuring gaze patterns in signing children could uncover how they master perceptual gaze control during a time of active language learning. Gaze patterns were recorded using a Tobii X120 eye tracker in 31 non-signing and 30 signing hearing infants (5-14 months) and children (2-8 years) as they watched signed narratives on video. Intelligibility of the signed narratives was manipulated by presenting them naturally and in video-reversed ("low intelligibility") conditions. This video manipulation was used because it distorts semantic content while preserving most surface phonological features. We examined where participants looked, using linear mixed models with Language Group (non-signing vs. signing) and Video Condition (Forward vs. Reversed), controlling for trial order. Non-signing infants and children showed a preference to look at the face as well as areas below the face, possibly because their gaze was drawn to the moving articulators in signing space. Native signing infants and children demonstrated resilient, face-focused gaze behavior. Moreover, their gaze behavior was unchanged for video-reversed signed narratives, similar to what was seen in adult native signers, possibly because they already have efficient, highly focused gaze behavior. The present study demonstrates that human perceptual gaze control is sensitive to visual language experience over the first year of life and emerges early, by 6 months of age. Results have implications for the critical importance of early visual language exposure for deaf infants. A video abstract of this article can be viewed at https://www.youtube.com/watch?v=2ahWUluFAAg.
| Download full-text PDF | Source |
|---|---|
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8284428 | PMC |
| http://dx.doi.org/10.1111/desc.13086 | DOI Listing |
Psych J
January 2025
Department of Psychology, Keio University, Tokyo, Japan.
From infancy, we spend considerable time absorbing social information from the external world. Social information processing, which starts with looking at facial expressions, affects behavior and cognition. Previous eye-tracking research using real photographs and movies has demonstrated that looking behaviors toward social cues such as faces may differ in individuals with autism spectrum disorder (ASD).
Accid Anal Prev
January 2025
Department of Civil Engineering, Indian Institute of Technology Roorkee, Roorkee, 247667, India.
Pedestrians use visual cues (i.e., gaze) to communicate with other road users, and visual attention toward the surrounding environment is essential for staying situationally aware and avoiding oncoming conflicts.
Sci Rep
January 2025
College of Computer Sciences, Anhui University, Hefei, 230039, China.
Decoding the semantic categories of complex scenes is fundamental to numerous artificial intelligence (AI) infrastructures. This work presents an advanced selection of multi-channel perceptual visual features for recognizing scenic images with elaborate spatial structures, focusing on developing a deep hierarchical model dedicated to learning human gaze behavior. Utilizing the BING objectness measure, we efficiently localize objects or their details across varying scales within scenes.
J Intellect Disabil
January 2025
Pro Vice Chancellor, Staffordshire University, UK.
Background: Autism spectrum disorder poses challenges in social communication and behavior, while intellectual disabilities are characterized by deficits in cognitive, social, and adaptive skills, frequently accompanied by stereotypies and challenging behaviors. Despite the progress made in autism spectrum disorder research, there is often a lack of research focusing on individuals with co-occurring autism spectrum disorder and intellectual disability. Robot-assisted autism therapies are effective in addressing these needs.
Comput Biol Med
January 2025
Institute of Informatics, Federal University of Goiás, GO, Brazil.
The Pupillary Light Reflex (PLR) is the involuntary movement of the pupil adapting to lighting conditions. The measurement and qualification of this information have a broad impact in different fields. Thanks to technological advancements and algorithms, obtaining accurate and non-invasive records of pupillary movements is now possible, expanding practical applications.