This mini review is aimed at a clinician-scientist seeking to understand the role of oscillations in neural processing and their functional relevance in speech and music perception. We present an overview of neural oscillations, methods used to study them, and their functional relevance with respect to music processing, aging, hearing loss, and disorders affecting speech and language. We first review the oscillatory frequency bands and their associations with speech and music processing.
Purpose: Listeners shift their listening strategies between lower level acoustic information and higher level semantic information to maximize speech intelligibility in challenging listening conditions. Although increasing task demands via acoustic degradation modulates lexical-semantic processing, the neural mechanisms underlying different listening strategies remain unclear. The current study examined the extent to which the encoding of lower level acoustic cues is modulated by task demand and its associations with lexical-semantic processes.
Purpose: Adults who stutter (AWS) exhibit compromised phonological working memory, poor central auditory processing, and impaired auditory processing, particularly during overt speech production tasks. However, such tasks are also sensitive to the language disturbances already present in AWS. In this study, monosyllables were therefore used to rule out language effects, and auditory working memory ability in AWS was evaluated using the n-back task.
Folia Phoniatr Logop, December 2021
Background: Recent models of speech production suggest a link between speech production and perception. Persons who stutter are known to have deficits in sensorimotor timing and to exhibit auditory processing problems. Most earlier studies have focused on assessing temporal ordering in adults who stutter (AWS), but limited attempts have been made to document temporal resolution abilities in AWS.
Background and Objectives: The influence of the visual stimulus on the auditory component in the perception of auditory-visual (AV) consonant-vowel syllables has been demonstrated across languages. Inherent properties of the unimodal stimuli are known to modulate AV integration. The present study investigated how the magnitude of the McGurk effect (an outcome of AV integration) varies across three different consonant combinations in Kannada.