Gestures are an important part of human communication. However, little is known about the neural correlates of gestures that accompany speech comprehension. The goal of this study was to investigate the neural basis of speech-gesture interaction, as reflected in activation increases and decreases during the observation of natural communication. Fourteen German participants watched video clips of 5 s duration depicting an actor who performed metaphoric gestures to illustrate the abstract content of spoken sentences. Video clips of isolated gestures (without speech), isolated spoken sentences (without gestures), and gestures in the context of an unknown language (Russian) were also presented while functional magnetic resonance imaging (fMRI) data were acquired. Bimodal speech and gesture processing led to left hemispheric activation increases in the posterior middle temporal gyrus, the premotor cortex, and the inferior frontal gyrus, as well as in the right superior temporal sulcus. Activation reductions during the bimodal condition were located in the left superior temporal gyrus and the left posterior insula. Gesture-related activation increases and decreases depended on language semantics and were not found in the unknown-language condition. Our results suggest that semantic integration processes for bimodal speech-plus-gesture comprehension are reflected in activation increases in the classical left hemispheric language areas. Speech-related gestures seem to enhance language comprehension during face-to-face communication.


Source: http://dx.doi.org/10.1016/j.neuropsychologia.2008.08.009


Similar Publications

Value Added by Assessing Nonspoken Vocabulary in Minimally Speaking Autistic Children.

Am J Speech Lang Pathol

January 2025

School of Communication Sciences and Disorders, McGill University, Montreal, Quebec, Canada.

Purpose: There is a scarcity of language assessment tools properly adapted for use with minimally speaking autistic children. As these children often use nonspoken methods of communication (i.e.


Syllable as a Synchronization Mechanism That Makes Human Speech Possible.

Brain Sci

December 2024

Department of Speech, Hearing and Phonetic Sciences, Division of Psychology and Language Sciences, University College London, Chandler House 2 Wakefield Street, London WC1N 1PF, UK.

Speech is a highly skilled motor activity that shares a core problem with other motor skills: how to reduce the massive degrees of freedom (DOF) to the extent that central nervous control and learning of complex motor movements become possible. It is hypothesized in this paper that a key solution to the DOF problem is to eliminate most of the temporal degrees of freedom by synchronizing concurrent movements, and that this is performed in speech through the syllable, a mechanism that synchronizes consonantal, vocalic, and laryngeal gestures. Under this hypothesis, syllable articulation is enabled by three basic mechanisms: target approximation, edge-synchronization, and tactile anchoring.


The dataset represents a significant advancement in Bengali lip-reading and visual speech recognition research, poised to drive future applications and technological progress. Although Bengali holds global status as the seventh most spoken language, with approximately 265 million speakers, linguistically rich and widely spoken languages like Bengali have been largely overlooked by the research community. The dataset fills this gap by offering a pioneering resource tailored for Bengali lip-reading, comprising visual data from 150 speakers across 54 classes, encompassing Bengali phonemes, alphabets, and symbols.


Examining Concurrent Associations Between Gesture Use, Developmental Domains, and Autistic Traits in Toddlers With Down Syndrome.

J Speech Lang Hear Res

January 2025

Down Syndrome Program, Division of Developmental Medicine, Department of Pediatrics, Boston Children's Hospital, Harvard Medical School, MA.

Purpose: Toddlers with Down syndrome (DS) showcase comparable or higher rates of gestures than chronological age- and language-matched toddlers without DS. Little is known about how gesture use in toddlers with DS relates to multiple domains of development, including motor, pragmatics, language, and visual reception (VR) skills. Unexplored is whether gesture use is a good marker of social communication skills in DS or if gesture development might be more reliably a marker of motor, language, pragmatics, or VR skills.


Despite increased attempts to express equality in speech, biases often leak out through subtle linguistic cues. For example, the subject-complement statement (SCS, "Girls are as good as boys at math") is used to advocate for equality but often reinforces gender stereotypes (boys are the standard against which girls are judged). We ask whether stereotypes conveyed by SCS can be counteracted by gesture.

