63 results match your criteria: "Lund University Humanities Lab[Affiliation]"

Researchers using eye tracking are heavily dependent on software and hardware tools to perform their studies, from recording eye tracking data and visualizing it, to processing and analyzing it. This article provides an overview of available tools for research using eye trackers and discusses considerations to make when choosing which tools to adopt for one's study.

Irrespective of the precision, the inaccuracy of a pupil-based eye tracker is about 0.5°. This paper delves into two factors that potentially increase the inaccuracy of the gaze signal, namely (1) pupil-size changes and the pupil-size artefact (PSA), and (2) the putative inability of experienced individuals to precisely refixate a visual target.
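
The distinction drawn here is that precision reflects sample-to-sample noise, whereas accuracy (or inaccuracy) reflects the systematic offset from the true gaze location. As a rough illustration, the sketch below computes both from gaze samples recorded while a participant fixates a known target; the function and variable names are illustrative, and this is a generic convention rather than the paper's own analysis.

```python
import numpy as np

def accuracy_and_precision(gaze_deg, target_deg):
    """Accuracy: mean angular offset from the target (deg).
    Precision: RMS of sample-to-sample distances (deg).
    gaze_deg: (N, 2) horizontal/vertical gaze positions in degrees.
    target_deg: (2,) position of the fixation target in degrees."""
    gaze_deg = np.asarray(gaze_deg, dtype=float)
    offsets = np.linalg.norm(gaze_deg - target_deg, axis=1)
    accuracy = offsets.mean()                        # systematic error (inaccuracy)
    step = np.diff(gaze_deg, axis=0)                 # sample-to-sample displacement
    precision_rms = np.sqrt((np.linalg.norm(step, axis=1) ** 2).mean())
    return accuracy, precision_rms

# Example: low-noise samples offset from a target at (0, 0) degrees,
# i.e., a precise but inaccurate recording.
rng = np.random.default_rng(0)
samples = rng.normal([0.4, 0.1], 0.05, size=(500, 2))
print(accuracy_and_precision(samples, np.array([0.0, 0.0])))
```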

The fundamentals of eye tracking part 1: The link between theory and research question.

Behav Res Methods

December 2024

Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584CS, Utrecht, The Netherlands.

Eye tracking technology has become increasingly prevalent in scientific research, offering unique insights into oculomotor and cognitive processes. The present article explores the relationship between scientific theory, the research question, and the use of eye-tracking technology. It aims to guide readers in determining if eye tracking is suitable for their studies and how to formulate relevant research questions.

Accurate eye tracking is crucial for gaze-dependent research, but calibrating eye trackers in subjects who cannot follow instructions, such as human infants and nonhuman primates, presents a challenge. Traditional calibration methods rely on verbal instructions, which are ineffective for these populations. To address this, researchers often use attention-grabbing stimuli in known locations; however, existing software for video-based calibration is often proprietary and inflexible.
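
To illustrate what such a video-based calibration ultimately produces, the sketch below fits a simple second-order polynomial mapping from raw gaze output to screen coordinates, using samples gathered while attention-grabbing stimuli were shown at known locations. This is a generic least-squares formulation with illustrative names, not the procedure of any particular calibration software.

```python
import numpy as np

def _design_matrix(raw_xy):
    """Constant, linear, interaction, and quadratic terms of the raw signal."""
    x, y = raw_xy[:, 0], raw_xy[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_calibration(raw_xy, target_xy):
    """Least-squares fit from raw gaze samples (N, 2) to the known
    screen positions (N, 2) of the attention-grabbing stimuli."""
    coef, *_ = np.linalg.lstsq(_design_matrix(raw_xy), target_xy, rcond=None)
    return coef  # shape (6, 2): one column per screen axis

def apply_calibration(coef, raw_xy):
    """Map subsequent raw samples to calibrated screen coordinates."""
    return _design_matrix(raw_xy) @ coef
```

In practice one would collect the raw samples while the animated stimulus plays at each known location, fit once, and then apply the mapping to all later data.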

Gaze-action coupling, gaze-gesture coupling, and exogenous attraction of gaze in dyadic interactions.

Atten Percept Psychophys

November 2024

Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584CS, Utrecht, Netherlands.

In human interactions, gaze may be used to acquire information for goal-directed actions, to acquire information related to the interacting partner's actions, and in the context of multimodal communication. At present, there are no models of gaze behavior in the context of vision that adequately incorporate these three components. In this study, we aimed to uncover and quantify patterns of within-person gaze-action coupling, gaze-gesture and gaze-speech coupling, and coupling between one person's gaze and another person's manual actions, gestures, or speech (or exogenous attraction of gaze) during dyadic collaboration.

Background: White noise stimulation has demonstrated efficacy in enhancing working memory in children with ADHD. However, its impact on other executive functions commonly affected by ADHD, such as inhibitory control, remains largely unexplored. This research aims to explore the effects of two types of white noise stimulation on oculomotor inhibitory control in children with ADHD.

Background: In attention-deficit/hyperactivity disorder (ADHD), poor inhibitory control is one of the main characteristics, with oculomotor inhibition impairments considered a potential biomarker of the disorder. While auditory white noise has been shown to enhance working memory in this group, visual white noise remains unexplored, as do the effects of both types of white noise stimulation on oculomotor inhibition.

Objective: This crossover study aims to explore the impact of auditory and visual white noise on oculomotor inhibition in children with ADHD and typically developing (TD) children.

When lab resources are shared among multiple research projects, issues such as experimental integrity, replicability, and data safety become important. Different research projects often need different software and settings that may well conflict with one another, and data collected for one project may not be safeguarded from exposure to researchers from other projects. In this paper we provide an infrastructure design and an open-source tool, labManager, that render multi-user lab facilities in the behavioral sciences accessible to research projects with widely varying needs.

Introduction: Temporal coordination between speech and gestures has been thoroughly studied in natural production. In most cases, gesture strokes precede or coincide with the stressed syllable of the words they are semantically associated with.

Methods: To understand whether the processing of speech and gestures is attuned to such temporal coordination, we investigated the effect of delaying, preposing, or eliminating individual gestures on memory for words in an experimental study in which 83 participants watched video sequences of naturalistic 3D-animated speakers generated from motion capture data.

Self-monitoring is essential for effectively regulating learning, but difficult in visual diagnostic tasks such as radiograph interpretation. Eye-tracking technology can visualize viewing behavior in gaze displays, thereby providing information about visual search and decision-making. We hypothesized that individually adaptive gaze-display feedback improves posttest performance and self-monitoring of medical students who learn to detect nodules in radiographs.

What is a blink? Classifying and characterizing blinks in eye openness signals.

Behav Res Methods

April 2024

Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584 CS, Utrecht, The Netherlands.

Blinks, the closing and opening of the eyelids, are used in a wide array of fields where human function and behavior are studied. In data from video-based eye trackers, blink rate and duration are often estimated from the pupil-size signal. However, blinks and their parameters can be estimated only indirectly from this signal, since it does not explicitly contain information about the eyelid position.
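
As a minimal sketch of the indirect approach described here, blinks are often approximated as stretches of lost pupil data. The code below assumes the pupil-size signal codes data loss as NaN, samples at a known rate, and uses illustrative duration thresholds; it does not implement the eye-openness-based classification the article itself proposes.

```python
import numpy as np

def blinks_from_pupil(pupil, fs, min_dur=0.03, max_dur=0.5):
    """Estimate blinks as runs of missing pupil-size samples (NaN).
    pupil: 1-D array of pupil sizes, NaN where the pupil was lost.
    fs: sampling frequency in Hz. Duration bounds (s) are illustrative."""
    lost = np.isnan(np.asarray(pupil, dtype=float))
    # Onsets and offsets of runs of data loss.
    edges = np.diff(lost.astype(int), prepend=0, append=0)
    onsets = np.flatnonzero(edges == 1)
    offsets = np.flatnonzero(edges == -1)
    blinks = []
    for on, off in zip(onsets, offsets):
        dur = (off - on) / fs
        if min_dur <= dur <= max_dur:   # discard implausibly short or long gaps
            blinks.append((on / fs, off / fs, dur))
    return blinks  # list of (onset_s, offset_s, duration_s)
```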

Purpose: According to most models of spoken word recognition, listeners probabilistically activate a set of lexical candidates, which is incrementally updated as the speech signal unfolds. Speech carries segmental (speech sound) as well as suprasegmental (prosodic) information. The role of the latter in spoken word recognition is less clear.

We built a novel setup to record large gaze shifts (up to 140°). The setup consists of a wearable eye tracker and a high-speed camera with fiducial marker technology to track the head. We tested our setup by replicating findings from the classic eye-head gaze shift literature.
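
To give a flavor of how eye-in-head and head-in-world signals combine in such a setup, the sketch below composes the two orientations and computes the amplitude of a gaze shift in degrees. It is a simplified, illustrative formulation (yaw/pitch only, hypothetical function names), not the authors' actual pipeline.

```python
import numpy as np

def direction(azimuth_deg, elevation_deg):
    """Unit gaze vector from azimuth (about vertical) and elevation."""
    az, el = np.radians([azimuth_deg, elevation_deg])
    return np.array([np.cos(el) * np.sin(az), np.sin(el), np.cos(el) * np.cos(az)])

def head_matrix(yaw_deg, pitch_deg):
    """Head orientation in the world as yaw (about vertical) and pitch rotations."""
    y, p = np.radians([yaw_deg, pitch_deg])
    Ry = np.array([[np.cos(y), 0, np.sin(y)], [0, 1, 0], [-np.sin(y), 0, np.cos(y)]])
    Rx = np.array([[1, 0, 0], [0, np.cos(p), -np.sin(p)], [0, np.sin(p), np.cos(p)]])
    return Ry @ Rx

def gaze_shift_amplitude(eye1, head1, eye2, head2):
    """Angle (deg) between gaze-in-world directions at two time points.
    eye*: (azimuth, elevation) of the eye in the head; head*: (yaw, pitch)."""
    g1 = head_matrix(*head1) @ direction(*eye1)
    g2 = head_matrix(*head2) @ direction(*eye2)
    return np.degrees(np.arccos(np.clip(g1 @ g2, -1.0, 1.0)))

# Example: a 40 deg eye movement combined with a 100 deg head turn gives a 140 deg shift.
print(gaze_shift_amplitude((0, 0), (0, 0), (40, 0), (100, 0)))
```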

We present a deep learning method for accurately localizing the center of a single corneal reflection (CR) in an eye image. Unlike previous approaches, we use a convolutional neural network (CNN) that was trained solely using synthetic data. Using only synthetic data has the benefit of completely sidestepping the time-consuming process of manual annotation that is required for supervised training on real eye images.
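
The snippet gives no architectural details, but a CR-center regressor of this kind can be sketched as a small CNN that maps an eye-image patch to (x, y) coordinates and is trained on rendered patches whose CR positions are known by construction (so no manual annotation is needed). The layer sizes, names, and training step below are purely illustrative assumptions, not the authors' network.

```python
import torch
import torch.nn as nn

class CRCenterNet(nn.Module):
    """Toy CNN that regresses the (x, y) corneal-reflection center
    from a 1-channel 64x64 eye-image patch (sizes are illustrative)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 32x32
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16x16
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(64 * 4 * 4, 128),
                                  nn.ReLU(), nn.Linear(128, 2))

    def forward(self, x):
        return self.head(self.features(x))

# One training step on synthetic patches with renderer-provided CR centers.
model, loss_fn = CRCenterNet(), nn.MSELoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
patches = torch.rand(8, 1, 64, 64)   # stand-in for rendered eye-image patches
centers = torch.rand(8, 2)           # ground-truth CR centers from the renderer
opt.zero_grad()
loss = loss_fn(model(patches), centers)
loss.backward()
opt.step()
```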

Previous work has shown that exposure to auditory white noise (WN) can improve cognitive performance in children with ADHD, but it is unknown whether this improvement generalizes to other sensory modalities. To address this knowledge gap, we tested the effect of Stochastic Vestibular Stimulation (SVS) on cognitive performance and reaction time (RT) variability in two groups: children with ADHD and typically developing children (TDC). Children with ADHD (N=42) and TDC (N=28) performed three cognitive tasks (Spanboard, Word Recall, and N-back) on two occasions, with and without exposure to SVS, in a double-blind design.

Minimal reporting guideline for research involving eye tracking (2023 edition).

Behav Res Methods

August 2024

Department of Neurology and Institute of Psychology II, Center of Brain, Behavior and Metabolism (CBBM), University of Luebeck, Luebeck, Germany.

A guideline is proposed that comprises the minimum items to be reported in research studies involving an eye tracker and human or non-human primate participant(s). This guideline was developed over a 3-year period using a consensus-based process via an open invitation to the international eye tracking community. This guideline will be reviewed at maximum intervals of 4 years.

Eye movements in visual impairment.

Vision Res

October 2023

Visual Neuroscience Group, School of Psychology, University of Nottingham, University Park, Nottingham NG7 2RD, UK.

This Special Issue describes the impact of visual impairment on visuomotor function. It includes contributions that examine gaze control in conditions associated with abnormal visual development such as amblyopia, dyslexia and neurofibromatosis as well as disorders associated with field loss later in life, such as macular degeneration and stroke. Specifically, the papers address both gaze holding (fixation), and gaze-following behavior (single saccades, sequences of saccades and smooth-pursuit) that characterize active vision in daily life and evaluate the influence of both pathological and simulated field loss.

Manual gestures and speech form a single integrated system during native language comprehension. However, it remains unclear whether this holds for second language (L2) comprehension, more specifically for simultaneous interpreting (SI), which involves comprehension in one language and simultaneous production in another. In a combined mismatch and priming paradigm, we presented Swedish speakers fluent in L2 English with multimodal stimuli in which speech was congruent or incongruent with a gesture.

According to the proposal for a minimum reporting guideline for an eye tracking study by Holmqvist et al. (2022), the accuracy (in degrees) of eye tracking data should be reported. Currently, there is no easy way to determine accuracy for wearable eye tracking recordings.
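
One generic way to obtain such an accuracy value, assuming both the gaze direction and the direction to a validation target can be expressed as 3-D vectors in the scene-camera coordinate system, is the mean angular difference sketched below; the names and coordinate convention are assumptions, not the tool described in the article.

```python
import numpy as np

def angular_accuracy_deg(gaze_vecs, target_vecs):
    """Mean angle (deg) between matched gaze and target direction vectors.
    gaze_vecs, target_vecs: (N, 3) arrays in the same (scene-camera) frame."""
    g = gaze_vecs / np.linalg.norm(gaze_vecs, axis=1, keepdims=True)
    t = target_vecs / np.linalg.norm(target_vecs, axis=1, keepdims=True)
    cos_angle = np.clip(np.sum(g * t, axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(cos_angle)).mean()
```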

A field test of computer-vision-based gaze estimation in psychology.

Behav Res Methods

March 2024

Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584 CS, Utrecht, the Netherlands.

Computer-vision-based gaze estimation refers to techniques that estimate gaze direction directly from video recordings of the eyes or face without the need for an eye tracker. Although many such methods exist, their validation is often found in the technical literature (e.g.

How well can modern wearable eye trackers cope with head and body movement? To investigate this question, we asked four participants to stand still, walk, skip, and jump while fixating a static physical target in space. We did this for six different eye trackers. All the eye trackers were capable of recording gaze during the most dynamic episodes (skipping and jumping).

Eye contact avoidance in crowds: A large wearable eye-tracking study.

Atten Percept Psychophys

November 2022

Experimental Psychology, Helmholtz Institute, Utrecht University, 3584CS, Utrecht, The Netherlands.

Eye contact is essential for human interactions. We investigated whether humans are able to avoid eye contact while navigating crowds. At a science festival, we fitted 62 participants with a wearable eye tracker and instructed them to walk a route.

Pupil-corneal reflection (P-CR) eye tracking has gained a prominent role in studying dog visual cognition, despite methodological challenges that often lead to lower-quality data than when recording from humans. In the current study, we investigated if and how the morphology of dogs might interfere with tracking by P-CR systems, and to what extent such interference, possibly in combination with dog-unique eye-movement characteristics, may undermine data quality and affect algorithmic eye-movement classification. To this end, we conducted an eye-tracking experiment with dogs and humans, investigated incidences of tracking interference, compared how the two species blinked, and examined how the differential quality of dog and human data affected the detection and classification of eye-movement events.
