In the contexts of language learning and music processing, hand gestures conveying acoustic information visually influence perception of speech and non-speech sounds (Connell et al., 2013; Morett & Chang, 2015). Currently, it is unclear whether this effect is due to these gestures' use of the human body to highlight relevant features of language (embodiment) or the cross-modal mapping between the visual motion trajectories of these gestures and corresponding auditory features (conceptual metaphor). To address this question, we examined identification of the pitch contours of lexical tones and non-speech analogs learned with pitch gesture, comparable dot motion, or no motion. Critically, pitch gesture and dot motion were either congruent or incongruent with the vertical conceptual metaphor of pitch. Consistent with our hypotheses, we found that identification accuracy increased for tones learned with congruent pitch gesture and dot motion, whereas it remained stable or decreased for tones learned with incongruent pitch gesture and dot motion. These findings provide the first evidence that both embodied and non-embodied visual stimuli congruent with the vertical conceptual metaphor of pitch enhance lexical and non-speech tone learning. Thus, they illuminate the influences of conceptual metaphor and embodiment on lexical and non-speech auditory perception, providing insight into how they can be leveraged to enhance language learning and music processing.


Source: http://dx.doi.org/10.1016/j.cognition.2022.105014

