AI Article Synopsis

  • The paper discusses how changes in speech perception due to altered auditory feedback suggest a link between motor skills and hearing in speech processing, but the exact processes involved are still unclear.
  • It proposes a Bayesian model to quantitatively evaluate how motor learning affects auditory perception, focusing on predictive relationships between speech production and perception.
  • The analysis aims to understand shifts in perceptual boundaries after feedback changes, the degree of compensation when feedback is altered, and how these factors correlate, using experimental evidence to support its findings.

Article Abstract

Shifts in perceptual boundaries resulting from speech motor learning induced by perturbations of auditory feedback have been taken as evidence for the involvement of motor functions in auditory speech perception. Beyond this general statement, the precise mechanisms underlying this involvement are not yet fully understood. In this paper we propose a quantitative evaluation of some hypotheses concerning the motor and auditory updates that could result from motor learning, in the context of various assumptions about the roles of the auditory and somatosensory pathways in speech perception. This analysis was made possible by the use of a Bayesian model that implements these hypotheses by expressing the relationships between speech production and speech perception in a joint probability distribution. The evaluation focuses on how the hypotheses can (1) predict the location of perceptual boundary shifts once the perturbation has been removed, (2) account for the magnitude of compensation in the presence of the perturbation, and (3) describe the correlation between these two behavioral characteristics. Experimental findings about changes in speech perception following adaptation to auditory feedback perturbations serve as a reference. Simulations suggest that these findings are compatible with a framework in which motor adaptation updates both the auditory-motor internal model and the auditory characterization of the perturbed phoneme, and in which perception involves both auditory and somatosensory pathways.
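
To make the modeling idea concrete, the sketch below (Python, not the authors' published code) shows how a simple Bayesian categorization model could translate adaptation-induced updates into a predicted perceptual boundary shift: each phoneme category has an auditory prototype and a somatosensory-derived prototype (standing in for the auditory-motor internal model), perception fuses the two Gaussian likelihoods, and adaptation partially shifts the prototypes of the perturbed phoneme. The category structure, the fusion rule, and every parameter value are illustrative assumptions, not taken from the paper.

import numpy as np
from scipy.stats import norm

# Hypothetical 1-D acoustic cue (e.g., F1 in Hz) categorized into two phonemes.
# Each pathway contributes a Gaussian likelihood; category priors are assumed equal.

def fused_posterior(x, mu_aud, mu_som, sigma_aud, sigma_som, w_som=1.0):
    """Posterior over the two categories from auditory and somatosensory
    likelihoods, assuming equal priors and conditional independence."""
    log_p = (norm.logpdf(x, mu_aud, sigma_aud)
             + w_som * norm.logpdf(x, mu_som, sigma_som))
    p = np.exp(log_p - log_p.max())
    return p / p.sum()

def boundary(mu_aud, mu_som, sigma_aud, sigma_som, w_som, grid):
    """Acoustic value at which the two categories are equally probable."""
    post0 = np.array([fused_posterior(x, mu_aud, mu_som,
                                      sigma_aud, sigma_som, w_som)[0]
                      for x in grid])
    return grid[np.argmin(np.abs(post0 - 0.5))]

grid = np.linspace(300.0, 800.0, 1001)   # acoustic cue axis
mu_aud = np.array([450.0, 620.0])        # auditory prototypes (hypothetical values)
mu_som = np.array([450.0, 620.0])        # somatosensory-derived prototypes
sig_a, sig_s = 40.0, 60.0                # pathway noise levels (hypothetical values)

b_pre = boundary(mu_aud, mu_som, sig_a, sig_s, 1.0, grid)

# Hypothetical adaptation to a +60 Hz feedback perturbation of phoneme 0:
# both the auditory characterization of the perturbed phoneme and the
# internal-model (somatosensory-derived) prototype absorb part of the shift.
mu_aud_post = mu_aud + np.array([0.4 * 60.0, 0.0])
mu_som_post = mu_som + np.array([0.2 * 60.0, 0.0])

b_post = boundary(mu_aud_post, mu_som_post, sig_a, sig_s, 1.0, grid)
print(f"Predicted perceptual boundary shift: {b_post - b_pre:+.1f} Hz")

In this toy setup, setting w_som to 0 restricts perception to the auditory pathway alone, so the predicted boundary shift depends only on the auditory update; comparing such variants against measured boundary shifts and compensation magnitudes is the kind of hypothesis evaluation the abstract describes.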

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5794199
DOI: http://dx.doi.org/10.1371/journal.pcbi.1005942

Publication Analysis

Top Keywords

speech perception: 16
speech motor: 8
evaluation hypotheses: 8
motor learning: 8
auditory feedback: 8
auditory somatosensory: 8
somatosensory pathways: 8
speech: 7
auditory: 7
motor: 6

Similar Publications

Comprehension of acoustically degraded emotional prosody in Alzheimer's disease and primary progressive aphasia.

Sci Rep

December 2024

Dementia Research Centre, Department of Neurodegenerative Disease, UCL Queen Square Institute of Neurology, University College London, 1st Floor, 8-11 Queen Square, London, WC1N 3AR, UK.

Previous research suggests that emotional prosody perception is impaired in neurodegenerative diseases like Alzheimer's disease (AD) and primary progressive aphasia (PPA). However, no previous research has investigated emotional prosody perception in these diseases under non-ideal listening conditions. We recruited 18 patients with AD and 31 with PPA (nine logopenic (lvPPA), 11 nonfluent/agrammatic (nfvPPA), and 11 semantic (svPPA)), together with 24 healthy age-matched individuals.

Background: Theories highlight the important role of chronic stress in remodeling HPA-axis responsivity. The Perceived Stress Scale (PSS) is one of the most widely used measures of enduring stress perceptions, yet no previous studies have evaluated whether greater perceived stress on the PSS is associated with cortisol hypo- or hyperactivity responses to the Trier Social Stress Test (TSST).

Objective: To examine whether high perceived stress over the past month, as measured by the PSS, alters cortisol and subjective acute stress reactivity to the TSST in healthy young adults.

Multi-talker speech intelligibility requires successful separation of the target speech from background speech. Successful speech segregation relies on bottom-up neural coding fidelity of sensory information and top-down effortful listening. Here, we studied the interaction between temporal processing measured using Envelope Following Responses (EFRs) to amplitude modulated tones, and pupil-indexed listening effort, as it related to performance on the Quick Speech-in-Noise (QuickSIN) test in normal-hearing adults.

How Does Deep Neural Network-Based Noise Reduction in Hearing Aids Impact Cochlear Implant Candidacy?

Audiol Res

December 2024

Division of Audiology, Department of Otolaryngology-Head and Neck Surgery, Mayo Clinic, Rochester, MN 55902, USA.

Background/objectives: Adult hearing-impaired patients qualifying for cochlear implants typically exhibit less than 60% sentence recognition under the best hearing aid conditions, either in quiet or noisy environments, with speech and noise presented through a single speaker. This study examines the influence of deep neural network-based (DNN-based) noise reduction on cochlear implant evaluation.

Methods: Speech perception was assessed using AzBio sentences in both quiet and noisy conditions (multi-talker babble) at 5 and 10 dB signal-to-noise ratios (SNRs) through one loudspeaker.

Background/objectives: Understanding speech in background noise is a challenging task for listeners with normal hearing and even more so for individuals with hearing impairments. The primary objective of this study was to develop Romanian speech material in noise to assess speech perception in diverse auditory populations, including individuals with normal hearing and those with various types of hearing loss. The goal was to create a versatile tool that can be used in different configurations and expanded for future studies examining auditory performance across various populations and rehabilitation methods.
