Imitation, Sign Language Skill and the Developmental Ease of Language Understanding (D-ELU) Model.

Front Psychol

Linnaeus Centre HEAD, Swedish Institute for Disability Research, Department of Behavioural Sciences and Learning, Linköping University Linköping, Sweden.

Published: February 2016

Imitation and language processing are closely connected. According to the Ease of Language Understanding (ELU) model (Rönnberg et al., 2013), pre-existing mental representations of lexical items facilitate language understanding. Thus, imitation of manual gestures is likely to be enhanced by experience of sign language. We tested this by eliciting imitation of manual gestures from deaf and hard-of-hearing (DHH) signing and hearing non-signing children at a similar level of language and cognitive development. We predicted that the DHH signing children would be better at imitating gestures lexicalized in their own sign language (Swedish Sign Language, SSL) than unfamiliar British Sign Language (BSL) signs, and that both groups would be better at imitating lexical signs (SSL and BSL) than non-signs. We also predicted that the hearing non-signing children would perform worse than the DHH signing children with all types of gestures the first time (T1) imitation was elicited, but that the performance gap between groups would be reduced when imitation was elicited a second time (T2). Finally, we predicted that imitation performance on both occasions would be associated with linguistic skills, especially in the manual modality. A split-plot repeated measures ANOVA demonstrated that DHH signers imitated manual gestures with greater precision than non-signing children when imitation was elicited the second time but not the first. Manual gestures were easier for both groups to imitate when they were lexicalized than when they were not, but there was no difference in performance between familiar (SSL) and unfamiliar (BSL) lexical gestures. For both groups, language skills at T1 predicted imitation at T2. Specifically, for the DHH children, word reading skills, comprehension, and phonological awareness of sign language predicted imitation at T2. For the hearing participants, language comprehension predicted imitation at T2, even after the effects of working memory capacity and motor skills were taken into account. These results demonstrate that experience of sign language enhances the ability to imitate manual gestures once representations have been established, and suggest that the inherent motor patterns of lexical manual gestures are better suited for representation than those of non-signs. This set of findings prompts a developmental version of the ELU model, the D-ELU.


Source

PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4754574
DOI: http://dx.doi.org/10.3389/fpsyg.2016.00107


