We compared scanpath similarity in response to repeated presentations of social and nonsocial images representing natural scenes in a sample of 30 participants with autism spectrum disorder and 32 matched typically developing individuals. We used scanpath similarity (calculated using ScanMatch) as a novel measure of attentional bias or preference, which constrains eye-movement patterns by directing attention to specific visual or semantic features of the image. We found that, compared with the control group, scanpath similarity of participants with autism was significantly higher in response to nonsocial images, and significantly lower in response to social images. Moreover, scanpaths of participants with autism were more similar to scanpaths of other participants with autism in response to nonsocial images, and less similar in response to social images. Finally, we also found that in response to nonsocial images, scanpath similarity of participants with autism did not decline with stimulus repetition to the same extent as in the control group, which suggests more perseverative attention in the autism spectrum disorder group. These results show a preferential fixation on certain elements of social stimuli in typically developing individuals compared with individuals with autism, and on certain elements of nonsocial stimuli in the autism spectrum disorder group, compared with the typically developing group.
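For context, ScanMatch-style comparison works by binning each fixation sequence onto a spatial grid to form a symbol string and then aligning two strings with the Needleman-Wunsch algorithm, using a substitution matrix that rewards spatially close grid cells. The sketch below is a simplified Python illustration of that idea, not the ScanMatch toolbox used in the study (which is a MATLAB toolbox and also incorporates temporal binning); the grid resolution, image dimensions, and gap penalty are arbitrary assumptions.

# Simplified ScanMatch-style scanpath similarity (illustrative sketch only).
# Assumptions: grid size, image size, and gap penalty are arbitrary; the real
# ScanMatch toolbox (MATLAB) also bins fixations in time before alignment.
from typing import List, Tuple
import numpy as np

GRID_X, GRID_Y = 8, 6        # spatial bins (assumed)
IMG_W, IMG_H = 1280, 960     # stimulus size in pixels (assumed)
GAP_PENALTY = -1.0

def bin_fixations(fixations: List[Tuple[float, float]]) -> List[int]:
    """Map (x, y) fixation coordinates to grid-cell indices."""
    seq = []
    for x, y in fixations:
        cx = min(int(x / IMG_W * GRID_X), GRID_X - 1)
        cy = min(int(y / IMG_H * GRID_Y), GRID_Y - 1)
        seq.append(cy * GRID_X + cx)
    return seq

def substitution_score(a: int, b: int) -> float:
    """Score in [-1, 1]: higher for spatially closer grid cells."""
    ax, ay = a % GRID_X, a // GRID_X
    bx, by = b % GRID_X, b // GRID_X
    dist = np.hypot(ax - bx, ay - by)
    max_dist = np.hypot(GRID_X - 1, GRID_Y - 1)
    return 1.0 - 2.0 * dist / max_dist

def scanpath_similarity(fix_a, fix_b) -> float:
    """Needleman-Wunsch alignment score, normalised by sequence length."""
    s1, s2 = bin_fixations(fix_a), bin_fixations(fix_b)
    n, m = len(s1), len(s2)
    score = np.zeros((n + 1, m + 1))
    score[:, 0] = np.arange(n + 1) * GAP_PENALTY
    score[0, :] = np.arange(m + 1) * GAP_PENALTY
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            score[i, j] = max(
                score[i - 1, j - 1] + substitution_score(s1[i - 1], s2[j - 1]),
                score[i - 1, j] + GAP_PENALTY,
                score[i, j - 1] + GAP_PENALTY,
            )
    return score[n, m] / max(n, m)

if __name__ == "__main__":
    viewer_a = [(100, 120), (640, 480), (900, 700)]  # hypothetical fixations
    viewer_b = [(110, 130), (650, 470), (880, 720)]
    print(f"scanpath similarity: {scanpath_similarity(viewer_a, viewer_b):.3f}")

Under this reading, higher scores mean two viewings fixated similar regions in a similar order, which is the sense in which the abstract reports higher or lower scanpath similarity within and between groups.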

Source
http://dx.doi.org/10.1177/1362361319865809

Publication Analysis

Top Keywords
scanpath similarity (20)
participants autism (20)
nonsocial images (16)
autism spectrum (12)
spectrum disorder (12)
typically developing (12)
response nonsocial (12)
autism (9)
individuals autism (8)
developing individuals (8)

Similar Publications

Studying attention to IPCC climate change maps with mobile eye-tracking.

PLoS One

January 2025

Faculty of Philosophy, Philosophy of Science and the Study of Religion, Ludwig Maximilian University of Munich, München, Germany.

Many visualisations used in the climate communication field aim to present the scientific models of climate change to the public. However, relatively little research has been conducted on how such data are visually processed, particularly from a behavioural science perspective. This study examines trends in visual attention to climate change predictions in world maps using mobile eye-tracking while participants engage with the visualisations.


Previous research has indicated that individuals with varying levels of reading comprehension (often used as a proxy for general cognitive ability) employ distinct reading eye movement patterns. This exploratory eye-tracking study aimed to investigate the text-reading process in adolescents with differing reading comprehension, specifically examining how these differences manifest at the global eye movement level through scanpath analysis. Our findings revealed two distinct groups of scanpaths characterized by statistically significant differences in eye movement parameters.

Article Synopsis
  • Visual search becomes easier when a target is repeatedly found in the same location, especially when the surrounding distractor configuration stays the same.
  • This benefit is usually attributed to memory for the specific locations where targets were previously found.
  • A newer account proposes that observers instead develop more efficient general eye-movement strategies for searching, which are applied preferentially at frequently searched locations.

The aim was to predict physician fixations on ophthalmology optical coherence tomography (OCT) reports from eye-tracking data using CNN-based saliency prediction methods, in order to aid the education of ophthalmologists and ophthalmologists-in-training. Fifteen ophthalmologists were recruited, each examining 20 randomly selected OCT reports and rating the likelihood of glaucoma for each report on a scale of 0-100. Eye movements were collected using a Pupil Labs Core eye-tracker.


Previous work has demonstrated similarities and differences between aerial and terrestrial image viewing. Aerial scene categorization, a pivotal visual processing task for gathering geoinformation, heavily depends on rotation-invariant information. Aerial image-centered research has revealed effects of low-level features on the performance of various aerial image interpretation tasks.

