Publications by authors named "Katie Bicevskis"

Multisensory information is integrated asymmetrically in speech perception: an audio signal can follow video by 240 ms, but can precede video by only 60 ms, without disrupting the sense of synchronicity (Munhall et al., 1996). Similarly, air flow can follow either audio (Gick et al.


Audio-visual [McGurk and MacDonald (1976). Nature 264, 746-748] and audio-tactile [Gick and Derrick (2009). Nature 462(7272), 502-504] speech stimuli enhance speech perception over audio stimuli alone.

  • Speakers adapt their speech movements based on whether an interlocutor is present, highlighting the importance of context in communication.
  • In a study, participants either mouthed or vocalized syllables while their lip and tongue movements were recorded; lip movement was greater when mouthing than when vocalizing, while tongue movement showed mixed results.
  • The findings suggest that in the absence of auditory signals, speakers increase visual articulation, demonstrating the complex interplay between auditory and visual aspects of speech in social interactions.