Exploring the role of hand gestures in learning novel phoneme contrasts and vocabulary in a second language.

Front Psychol

Center for Language and Brain, Colgate University, Hamilton, NY, USA; Department of East Asian Languages and Literatures, Colgate University, Hamilton, NY, USA.

Published: July 2014

Co-speech hand gestures are a type of multimodal input that has received relatively little attention in the context of second language learning. The present study explored the role that observing and producing different types of gestures plays in learning novel speech sounds and word meanings in an L2. Naïve English speakers were taught two components of Japanese (novel phonemic vowel-length contrasts and vocabulary items containing those contrasts) in one of four gesture conditions: Syllable Observe, Syllable Produce, Mora Observe, and Mora Produce. Half of the gestures conveyed intuitive information about syllable structure; the other half conveyed unintuitive information about Japanese mora structure. Within the Syllable and Mora conditions, half of the participants only observed the gestures that accompanied speech during training, and the other half also produced the gestures they observed along with the speech. The main finding was that participants in all four conditions performed similarly on two types of auditory identification tasks and on a vocabulary test. The results suggest that hand gestures may not be well suited for learning novel phonetic distinctions at the syllable level within a word, and thus that gesture-speech integration may break down at the lowest levels of language processing and learning.

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4077026
DOI: http://dx.doi.org/10.3389/fpsyg.2014.00673

Publication Analysis

Top Keywords

hand gestures: 12
learning novel: 12
contrasts vocabulary: 8
second language: 8
gestures: 7
learning: 5
syllable: 5
exploring role: 4
role hand: 4
gestures learning: 4

Similar Publications

The dynamic Colombian sign language dataset for basic conversation LSC70.

Data Brief

February 2025

Dynamic Systems, Instrumentation and Control (SIDICO), Department of Physics, Universidad del Cauca, Colombia.

Sign language is a form of non-verbal communication used by people with hearing disabilities. This form of communication relies on signs, gestures, facial expressions, and more. Because the population with hearing impairments in Colombia is around half a million, a database of dynamic alphanumeric signs and commonly used words was created to support basic conversation.

Hand gestures provide an alternative interaction modality for blind users and can be supported on commodity smartwatches without requiring specialized sensors. The enabling technology is an accurate gesture recognition algorithm, but almost all such algorithms are designed for sighted users. Our study shows that blind users' gestures differ considerably from sighted users' gestures, rendering current recognition algorithms unsuitable.
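As an illustration of the kind of recognition algorithm this abstract refers to, the sketch below implements a minimal template-based gesture classifier using dynamic time warping over smartwatch accelerometer traces. Everything here (function names, data shapes, the 1-nearest-neighbor rule) is an assumption for illustration, not the study's actual method.

```python
# Illustrative sketch only: minimal DTW-based gesture recognition.
# All names and data shapes are hypothetical, not from the paper.
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Dynamic-time-warping distance between two (T, 3) accelerometer traces."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # frame-to-frame distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def classify(trace: np.ndarray, templates: dict[str, list[np.ndarray]]) -> str:
    """Label a trace by its nearest stored template under DTW (1-NN)."""
    return min(
        ((label, dtw_distance(trace, t))
         for label, ts in templates.items() for t in ts),
        key=lambda pair: pair[1],
    )[0]
```

Under a scheme like this, the stored templates encode what each gesture "should" look like; the study's finding implies that templates collected from sighted users would match blind users' traces poorly, so group-specific templates or algorithms would be needed.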

The role of the left primary motor cortex in apraxia.

Neurol Res Pract

January 2025

Department of Neurology, Faculty of Medicine and University Hospital Cologne, University of Cologne, Kerpener Str. 62, 50937, Cologne, Germany.

Background: Apraxia is a motor-cognitive disorder that cannot be explained solely by primary sensorimotor deficits. Previous research in stroke patients has focused on damage to the fronto-parietal praxis networks of the left hemisphere (LH) as the cause of apraxic deficits. In contrast, the potential role of the (left) primary motor cortex (M1) has largely been neglected.

Gestural production, a crucial aspect of nonverbal communication, plays a key role in the development of verbal and socio-communicative skills. Delays in gestural development often impede verbal acquisition and social interaction in children with Autism Spectrum Disorder (ASD). Although various interventions for ASD focus on improving socio-communicative abilities, they consistently highlight the importance of integrating gestures to support overall communication development.

Liquid-Metal-Based Multichannel Strain Sensor for Sign Language Gesture Classification Using Machine Learning.

ACS Appl Mater Interfaces

January 2025

Centre for Robotics and Automation, Department of Biomedical Engineering, City University of Hong Kong, Hong Kong 999077, China.

Liquid metals combine the high conductivity of metallic materials with excellent deformability in their liquid state, making them promising for flexible and stretchable wearable sensors. However, patterning liquid metals on soft substrates has been a challenge because of their high surface tension. In this paper, a new method is proposed to overcome the difficulties in fabricating liquid-metal-based strain sensors.
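As a rough illustration of the classification stage named in the title, the sketch below runs a generic machine-learning pipeline over synthetic multichannel strain readings. The channel count, the per-channel summary features, and the random-forest classifier are all assumptions made here for illustration; the abstract does not specify the paper's actual sensor layout or model, and the labels below are random, so the printed accuracy will sit near chance.

```python
# Illustrative sketch only: gesture classification from multichannel strain
# signals with a generic ML pipeline. All shapes and choices are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in data: 200 recordings, 5 strain channels, 50 frames each,
# summarized by simple per-channel statistics (mean, std, peak).
n_samples, n_channels, n_frames = 200, 5, 50
raw = rng.normal(size=(n_samples, n_channels, n_frames))
features = np.concatenate(
    [raw.mean(axis=2), raw.std(axis=2), raw.max(axis=2)], axis=1
)
labels = rng.integers(0, 10, size=n_samples)  # e.g. 10 hypothetical sign classes

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0
)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

With real strain recordings, the features would carry gesture-specific signal and the same pipeline would separate classes; the synthetic data here only demonstrates the plumbing.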
