Seeing to hear better: evidence for early audio-visual interactions in speech identification.

Cognition

Institut de la Communication Parlée, CNRS-INPG-Université Stendhal, 46 Av. Félix Viallet, 38031 Grenoble 1, France.

Published: September 2004

AI Article Synopsis

  • Lip reading helps people understand speech better by watching the speaker's lips, especially in noisy environments.
  • Recent experiments show that seeing lips can improve sensitivity to sound, making it easier to detect speech in noise.
  • The study found that visual information significantly enhances speech intelligibility compared to just audio, highlighting its importance in understanding communication.

Article Abstract

Lip reading is the ability to partially understand speech by looking at the speaker's lips. It improves the intelligibility of speech in noise when audio-visual perception is compared with audio-only perception. A recent set of experiments showed that seeing the speaker's lips also enhances sensitivity to acoustic information, decreasing the auditory detection threshold of speech embedded in noise [J. Acoust. Soc. Am. 109 (2001) 2272; J. Acoust. Soc. Am. 108 (2000) 1197]. However, detection is different from comprehension, and it remains to be seen whether improved sensitivity also results in an intelligibility gain in audio-visual speech perception. In this work, we use an original paradigm to show that seeing the speaker's lips enables the listener to hear better and hence to understand better. The audio-visual stimuli used here could not be differentiated by lip reading per se, since they contained exactly the same lip gesture matched with different compatible speech sounds. Nevertheless, the noise-masked stimuli were more intelligible in the audio-visual condition than in the audio-only condition, due to the contribution of visual information to the extraction of acoustic cues. Replacing the lip gesture by a non-speech visual input with exactly the same time course, providing the same temporal cues for extraction, removed the intelligibility benefit. This early contribution to audio-visual speech identification is discussed in relation to recent neurophysiological data on audio-visual perception.


Source
http://dx.doi.org/10.1016/j.cognition.2004.01.006

