An Eye Tracking Study on the Perception and Comprehension of Unimodal and Bimodal Linguistic Inputs by Deaf Adolescents.

Front Psychol

Departamento de Psicología Evolutiva y de la Educación, Universidad de Sevilla, Seville, Spain.

Published: June 2017

An eye tracking experiment explored the gaze behavior of deaf individuals when perceiving language in spoken language only, in sign language only, and in sign-supported speech (SSS). Participants were deaf (n = 25) and hearing (n = 25) Spanish adolescents. Deaf students were either prelingually profoundly deaf individuals with cochlear implants (CIs) used by age 5 or earlier, or prelingually profoundly deaf native signers with deaf parents. The effectiveness of SSS has rarely been tested within the same group of children for discourse-level comprehension. Here, video-recorded texts, including spatial descriptions, were alternately transmitted in spoken language, sign language, and SSS. The capacity of these communicative systems to bring comprehension in deaf participants to the level reached by hearing participants receiving spoken language was tested. Within-group analyses of deaf participants tested whether the bimodal linguistic input of SSS favored discourse comprehension compared with the unimodal languages. Deaf participants with CIs achieved comprehension equal to that of hearing controls in all communicative systems, while deaf native signers without CIs achieved comprehension equal to that of hearing participants when tested in their native sign language. Comprehension of SSS was not higher than comprehension of spoken language, even when spatial information was communicated. Eye movements of deaf and hearing participants were tracked, and dwell times spent looking at the face or body area of the sign model were analyzed. Within-group analyses focused on differences between native and non-native signers. Dwell times of hearing participants were equally distributed across the upper and lower areas of the face, while deaf participants looked mainly at the mouth area; this could enable information to be obtained from mouthings in sign language and from lip-reading in SSS and spoken language. Few fixations were directed toward the signs, although these were more frequent when spatial language was transmitted. Both native and non-native signers looked mainly at the face when perceiving sign language, although non-native signers looked significantly more at the body than native signers did. This distribution of gaze fixations suggests that deaf individuals - particularly native signers - mainly perceive signs through peripheral vision.


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5478736
DOI: http://dx.doi.org/10.3389/fpsyg.2017.01044

Publication Analysis

Top Keywords

sign language (20); native signers (16); spoken language (16); deaf participants (16); hearing participants (16); deaf (14); deaf individuals (12); participants tested (12); non-native signers (12); language (11)

Similar Publications

Purpose: The present study assessed the test-retest reliability of the American Sign Language (ASL) version of the Computerized Revised Token Test (CRTT-ASL) and compared the differences and similarities between ASL and English reading by Deaf and hearing users of ASL.

Method: Creation of the CRTT-ASL involved filming, editing, and validating CRTT instructions, sentence commands, and scoring. Deaf proficient (DP), hearing nonproficient (HNP), and hearing proficient sign language users completed the CRTT-ASL and the English self-paced, word-by-word reading CRTT (CRTT-Reading-Word Fade [CRTT-R-wf]).


Background: Chinese cancer survivors fare poorly in returning to work. Peer support, an external coping resource that helps cancer survivors return to work, brings together members of the lay community who face similar stressors or problems for mutual support. Because peer volunteers have not received systematic training, inappropriate language during the support process can cause secondary harm to both the peer volunteer and the cancer survivor.


Background: Individuals with hearing impairments may face barriers to health care, which may significantly affect prognosis and the incidence of complications and iatrogenic events. Therefore, the development of automatic communication systems to support interaction between this population and health care workers is paramount.

Objective: This study aims to systematically review the evidence on communication systems using human-computer interaction techniques developed for deaf people who communicate through sign language that are already in use or proposed for use in health care contexts and have been tested with human users or videos of human users.


Research shows that insufficient language access in early childhood significantly affects language processing. While the majority of this work focuses on syntax, phonology also appears to be affected, though it is unclear exactly how. Here we investigated phonological production across age of acquisition of American Sign Language (ASL).
