AI Article Synopsis

  • Speakers adapt their speech movements based on whether an interlocutor is present, highlighting the importance of context in communication.
  • In the study, participants either mouthed or vocalized syllables while their lip and tongue movements were recorded; lip movement was greater when mouthing than when vocalizing, while tongue movement showed mixed results.
  • The findings suggest that in the absence of auditory signals, speakers increase visual articulation, demonstrating the complex interplay between auditory and visual aspects of speech in social interactions.

Article Abstract

Speakers take into account what information a conversation partner requires in a given context in order to best understand an utterance. Despite growing evidence that movements of visible articulators such as the lips are augmented in mouthed speech relative to vocalized speech, little work to date has compared this effect between visible and non-visible articulators. In addition, no studies have examined whether interlocutor engagement differentially impacts these articulators. Building on a basic present/not-present design, we investigated whether the presence of audible speech information and/or an interlocutor affects the movements of the lips and the tongue. Participants were asked to a) speak or b) mouth three target syllables in interlocutor-present and interlocutor-not-present conditions, while lip and tongue movements were recorded using video and ultrasound imaging. Results show that lip protrusion was greater in mouthed conditions than in vocalized ones, while tongue movements were either attenuated (/wa/) or unaffected (/ri/, /ra/) by these same conditions, indicating differential effects for the visible and non-visible articulators in the absence of an auditory signal. A significant interaction between the social engagement and vocalizing conditions with respect to lip aperture showed that participants produced smaller lip apertures when vocalizing alone than when in the presence of an interlocutor. However, measures of lip protrusion showed no effect of social engagement. We conclude that speakers make use of both auditory and visual modalities in the presence of an interlocutor, and that when acoustic information is unavailable, compensatory increases are made in the visual domain. Our findings shed new light on the multimodal nature of speech, and pose new questions about differential adaptations made by visible and non-visible articulators in different speech conditions.

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5104351

Publication Analysis

Top Keywords

visible non-visible (16)
non-visible articulators (16)
movements visible (8)
tongue movements (8)
lip protrusion (8)
social engagement (8)
presence interlocutor (8)
interlocutor (5)
movements (5)
visible (5)
