Publications by authors named "Hermann Ackermann"

Nonspeech (or paraspeech) parameters are widely used in clinical assessment of speech impairment in persons with dysarthria (PWD). Virtually every standard clinical instrument used in dysarthria diagnostics includes nonspeech parameters, often in considerable numbers. While theoretical considerations have challenged the validity of these measures as markers of speech impairment, only a few studies have directly examined their relationship to speech parameters on a broader scale.

This review article summarizes various functions of the dorsolateral prefrontal cortex (DLPFC) that are related to language processing. To this end, its connectivity with the left-dominant perisylvian language network was considered, as well as its interaction with other functional networks that, directly or indirectly, contribute to language processing. Language-related functions of the DLPFC comprise various aspects of pragmatic processing such as discourse management, integration of prosody, interpretation of nonliteral meanings, inference making, ambiguity resolution, and error repair.

Cross-correlation of magnetoencephalography (MEG) with time courses derived from the speech signal has shown differences in phase-locking between blind subjects able to comprehend accelerated speech and sighted controls. The present training study helps to disentangle the effects of blindness and training. Both subject groups (baseline: n = 16 blind, 13 sighted; trained: 10 blind, 3 sighted) were able to enhance speech comprehension up to ca.

The pre-supplementary motor area (pre-SMA) is engaged in speech comprehension under difficult circumstances such as poor acoustic signal quality or time-critical conditions. Previous studies found that left pre-SMA is activated when subjects listen to accelerated speech. Here, the functional role of pre-SMA was tested for accelerated speech comprehension by inducing a transient "virtual lesion" using continuous theta-burst stimulation (cTBS).

Speech is one of the most distinctive features of human communication. Our ability to articulate our thoughts by means of speech production depends critically on the integrity of the motor cortex. Although the motor cortex was long thought to be a low-order brain region, exciting work in recent years is overturning this notion.

Apart from its function in speech motor control, the supplementary motor area (SMA) has largely been neglected in models of speech and language processing in the brain. The aim of this review paper is to summarize more recent work suggesting that the SMA has various superordinate control functions during speech communication and language reception that become particularly relevant when task demands increase. The SMA is subdivided into a posterior region serving predominantly motor-related functions (SMA proper) and an anterior part (pre-SMA) involved in higher-order cognitive control mechanisms.

Discourse structure enables us to generate expectations based upon linguistic material that has already been introduced. The present magnetoencephalography (MEG) study addresses auditory perception of test sentences in which discourse coherence was manipulated by using presuppositions (PSP) that either correspond or fail to correspond to items in preceding context sentences with respect to uniqueness and existence. Context violations yielded delayed auditory M50 and enhanced auditory M200 cross-correlation responses to syllable onsets within an analysis window of 1.

In many functional magnetic resonance imaging (fMRI) studies, blind humans have been found to show cross-modal reorganization engaging the visual system in non-visual tasks. For example, blind people can manage to understand (synthetic) spoken language at very high speaking rates of up to ca. 20 syllables/s (syl/s).

Late-blind humans can learn to understand speech at ultra-fast syllable rates (ca. 20 syllables/s), a capability associated with hemodynamic activation of the central-visual system. Thus, the observed functional cross-modal recruitment of occipital cortex might facilitate ultra-fast speech processing in these individuals.

Patterns of dysarthria in spinocerebellar ataxias (SCAs) and their discriminative features still remain elusive. Here we aimed to compare the dysarthria profiles of patients with SCA3 and SCA6 vs. Friedreich ataxia (FRDA), focussing on three particularly vulnerable speech parameters in ataxic dysarthria (speaking rate, prosodic modulation, and intelligibility) as well as on a specific oral non-speech variable of ataxic impairment, i.

Any account of "what is special about the human brain" (Passingham 2008) must specify the neural basis of our unique ability to produce speech and delineate how these remarkable motor capabilities could have emerged in our hominin ancestors. Clinical data suggest that the basal ganglia provide a platform for the integration of primate-general mechanisms of acoustic communication with the faculty of articulate speech in humans. Furthermore, neurobiological and paleoanthropological data point to a two-stage model of the phylogenetic evolution of this crucial prerequisite of spoken language: (i) monosynaptic refinement of the projections of motor cortex to the brainstem nuclei that steer laryngeal muscles, presumably, as part of a "phylogenetic trend" associated with increasing brain size during hominin evolution; (ii) subsequent vocal-laryngeal elaboration of cortico-basal ganglia circuitries, driven by human-specific FOXP2 mutations.

The processing of nonverbal auditory stimuli has not yet been sufficiently investigated in patients with aphasia. Using a duration discrimination task, we examined whether patients with left-sided cerebrovascular lesions were able to perceive time differences on the scale of approximately 150 ms. Additional linguistic and memory-related tasks were used to characterize more precisely the relationship between performance on the nonverbal auditory task and selective linguistic or mnemonic disturbances.
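As a rough illustration of the task structure only (not the clinical protocol used in the study), the following Python sketch builds a single two-interval trial in which the comparison tone is 150 ms longer than the standard; tone frequency, base duration, inter-stimulus interval, and sampling rate are arbitrary choices.

```python
# Sketch of a single duration-discrimination trial: two tones whose durations
# differ by ~150 ms. Frequency, base duration, and sample rate are illustrative
# choices, not parameters reported in the study.
import numpy as np
from scipy.io import wavfile

FS = 44100          # sampling rate (Hz)
FREQ = 440.0        # tone frequency (Hz), arbitrary
BASE_DUR = 0.500    # standard tone duration (s), arbitrary
DELTA = 0.150       # duration difference to be detected (s)

def make_tone(duration_s: float) -> np.ndarray:
    """Generate a sine tone with short linear onset/offset ramps."""
    t = np.arange(int(duration_s * FS)) / FS
    tone = 0.5 * np.sin(2 * np.pi * FREQ * t)
    ramp = int(0.01 * FS)                       # 10 ms ramps to avoid clicks
    env = np.ones_like(tone)
    env[:ramp] = np.linspace(0, 1, ramp)
    env[-ramp:] = np.linspace(1, 0, ramp)
    return tone * env

standard = make_tone(BASE_DUR)
comparison = make_tone(BASE_DUR + DELTA)
gap = np.zeros(int(0.5 * FS))                   # 500 ms inter-stimulus interval

trial = np.concatenate([standard, comparison * 0 + comparison, ])  # placeholder removed below
trial = np.concatenate([standard, gap, comparison]).astype(np.float32)
wavfile.write("duration_trial.wav", FS, trial)
```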

Functional imaging demonstrated hemodynamic activation within specific brain areas that contribute to frequency-dependent movement control. Previous investigations demonstrated a linear relationship between movement and hemodynamic response rates within cortical regions, whereas the basal ganglia displayed an inverse neural activation pattern. We now investigated neural correlates of frequency-related finger movements in patients with Parkinson's disease (PD) to further elucidate the neurofunctional alterations in cortico-subcortical networks in that disorder.

Individuals suffering from vision loss of a peripheral origin may learn to understand spoken language at a rate of up to about 22 syllables (syl) per second (s), exceeding by far the maximum performance level of untrained listeners (ca. 8 syl/s). Previous findings indicate that the central-visual system contributes to the processing of accelerated speech in blind subjects.

In blind people, the visual channel cannot assist face-to-face communication via lipreading or visual prosody. Nevertheless, the visual system may enhance the evaluation of auditory information due to its cross-links to (1) the auditory system, (2) supramodal representations, and (3) frontal action-related areas. Apart from feedback or top-down support of, for example, the processing of spatial or phonological representations, experimental data have shown that the visual system can impact auditory perception at more basic computational stages such as temporal signal resolution.

The aim of this article is to explicate the uniqueness of the motor activity implied in spoken language production and to emphasize how important it is, from a theoretical and a clinical perspective, to consider the motor events associated with speaking as domain-specific, i.e., as pertaining to the domain of linguistic expression.

Background: Individuals suffering from vision loss of a peripheral origin may learn to understand spoken language at a rate of up to about 22 syllables (syl) per second, exceeding by far the maximum performance level of normal-sighted listeners (ca. 8 syl/s). To further elucidate the brain mechanisms underlying this extraordinary skill, functional magnetic resonance imaging (fMRI) was performed in blind subjects of varying ultra-fast speech comprehension capabilities and in sighted individuals while they listened to sentence utterances at a moderately fast (8 syl/s) or ultra-fast (16 syl/s) syllabic rate.
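For illustration only: the published ultra-fast stimuli were synthesized at the target rates, but a comparable compression of a natural recording can be sketched with a phase-vocoder time stretch, which preserves pitch. The file name and syllable count below are hypothetical.

```python
# Minimal sketch: time-compress a recorded sentence to a target syllabic rate
# with a phase vocoder (pitch is preserved). This is not the stimulus-generation
# procedure of the study; file name and syllable count are hypothetical.
import librosa
import soundfile as sf

TARGET_RATE = 16.0            # target syllables per second
N_SYLLABLES = 24              # syllable count of the utterance (hypothetical)

y, sr = librosa.load("sentence.wav", sr=None)   # hypothetical recording
orig_dur = len(y) / sr
orig_rate = N_SYLLABLES / orig_dur              # current syllables per second

stretch = TARGET_RATE / orig_rate               # factor > 1 speeds the utterance up
y_fast = librosa.effects.time_stretch(y, rate=stretch)

sf.write("sentence_16syl.wav", y_fast, sr)
print(f"{orig_rate:.1f} syl/s -> {TARGET_RATE:.1f} syl/s (stretch factor {stretch:.2f})")
```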

Neurophonetics. Wiley Interdiscip Rev Cogn Sci, March 2013

Neurophonetics aims at elucidating the brain mechanisms underlying speech communication in our species. Clinical observations in patients with speech impairments following cerebral disorders provided the initial vantage point of this research area and indicated that distinct functional-neuroanatomic systems support human speaking and listening. Subsequent approaches, which considered speech production a motor skill, investigated the vocal tract movements associated with spoken language by means of kinematic and electromyographic techniques, allowing, among other things, for the evaluation of computational models that posit elementary phonological gestures or a mental syllabary as basic units of speech motor control.

Blind people can learn to understand speech at ultra-high syllable rates (ca. 20 syllables/s), a capability associated with hemodynamic activation of the central-visual system. To further elucidate the neural mechanisms underlying this skill, magnetoencephalographic (MEG) measurements during listening to sentence utterances were cross-correlated with time courses derived from the speech signal (envelope, syllable onsets and pitch periodicity) to capture phase-locked MEG components (14 blind, 12 sighted subjects; speech rate=8 or 16 syllables/s, pre-defined source regions: auditory and visual cortex, inferior frontal gyrus).
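A minimal Python sketch of the general analysis idea, using synthetic data in place of MEG source waveforms; the envelope extraction, lag window, and normalization are illustrative choices, not the parameters of the published pipeline.

```python
# Sketch of the general idea: cross-correlate a neural time course with the
# amplitude envelope of the speech signal to find phase-locked components.
# Synthetic data stand in for MEG source waveforms.
import numpy as np
from scipy.signal import hilbert, correlate

FS = 1000                                   # common sampling rate (Hz)
rng = np.random.default_rng(0)

speech = rng.standard_normal(10 * FS)       # stand-in for a speech waveform
envelope = np.abs(hilbert(speech))          # amplitude envelope via Hilbert transform
envelope -= envelope.mean()

# Stand-in "MEG" channel: envelope delayed by 100 ms plus noise,
# mimicking a response that is phase-locked to the envelope.
delay = int(0.100 * FS)
meg = np.roll(envelope, delay) + rng.standard_normal(envelope.size)
meg -= meg.mean()

# Normalized cross-correlation, inspected within lags of -500..+500 ms.
xcorr = correlate(meg, envelope, mode="full")
xcorr /= np.std(meg) * np.std(envelope) * envelope.size
lags = np.arange(-envelope.size + 1, envelope.size)
keep = np.abs(lags) <= int(0.5 * FS)
peak_lag_ms = lags[keep][np.argmax(xcorr[keep])] * 1000 / FS
print(f"peak cross-correlation at {peak_lag_ms:.0f} ms lag")
```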

Friedreich ataxia (FRDA) is the most frequent recessive ataxia in the Western world. Dysarthria is a cardinal feature of FRDA, often leading to severe impairments in daily functioning, but its exact characteristics are only poorly understood so far. We performed a comprehensive evaluation of dysarthria severity and the profile of speech motor deficits in 20 patients with a genetic diagnosis of FRDA based on a carefully selected battery of speaking tasks and two widely used paraspeech tasks, i.

Individual differences in second language (L2) aptitude have been assumed to depend upon a variety of cognitive and personality factors. In particular, the cognitive factor of phonological working memory has been conceptualised as a language learning device. However, strong associations between phonological working memory and L2 aptitude have previously been found only in early-stage learners, not in advanced learners.

Asperger syndrome (AS) includes impaired recognition of other people's mental states. Since language-based diagnostic procedures may be confounded by cognitive-linguistic compensation strategies, nonverbal test materials were created, including human affective and vegetative sounds. Depending on the video context, each sound could be interpreted either as the direct expression of an agent's affective/vegetative state or as the result of intentional-executive mental operations.

Purpose: The aim of this study was to assess the brain regions associated with impaired performance in a virtual, dynamic collision avoidance task in a group of patients with homonymous visual field defects (HVFDs) due to unilateral vascular brain lesions.

Methods: Overall task performance was quantitatively assessed as the number of collisions while crossing an intersection at two levels of traffic density. Twenty-six patients were divided into two subgroups using the median split method: patients with 'performance above average' (HVFD(A), i.
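A minimal sketch of the median split step, with made-up collision counts; the variable names and the tie-handling rule (counts equal to the median go to the better-performing group) are illustrative assumptions.

```python
# Minimal sketch of a median split: divide patients into better- and
# worse-performing subgroups by the median number of collisions.
# The collision counts below are made up for illustration.
import numpy as np

collisions = np.array([2, 5, 1, 7, 3, 9, 0, 4, 6, 2, 8, 5])  # one value per patient
cutoff = np.median(collisions)

above_average = collisions <= cutoff   # fewer collisions = better performance
below_average = collisions > cutoff

print(f"median = {cutoff}")
print(f"better performers (n={above_average.sum()}): {collisions[above_average]}")
print(f"worse performers  (n={below_average.sum()}): {collisions[below_average]}")
```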

During speech perception, acoustic correlates of syllable structure and pitch periodicity are directly reflected in electrophysiological brain activity. Magnetoencephalography (MEG) recordings were made while 10 participants listened to natural or formant-synthesized speech at a moderately fast or an ultrafast rate. Cross-correlation analysis was applied to show brain activity time-locked to the speech envelope, to an acoustic marker of syllable onsets, and to pitch periodicity.
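Speech-derived reference time courses of this kind can be sketched in Python as follows; the onset marker (positive envelope derivative) and the autocorrelation-based periodicity measure are plausible stand-ins, not the exact acoustic markers of the published analysis, and the frame and filter settings are assumptions.

```python
# Sketch of speech-derived reference time courses: a low-passed amplitude
# envelope, a syllable-onset marker from its rising slope, and a coarse
# pitch-periodicity measure from frame-wise autocorrelation.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def speech_time_courses(speech: np.ndarray, fs: int):
    # Low-pass the amplitude envelope to the syllable-rate range (< 10 Hz).
    env = np.abs(hilbert(speech))
    b, a = butter(4, 10 / (fs / 2), btype="low")
    env_lp = filtfilt(b, a, env)

    # Syllable-onset marker: positive part of the envelope derivative.
    onsets = np.clip(np.diff(env_lp, prepend=env_lp[0]), 0, None)

    # Pitch periodicity: peak of the frame-wise autocorrelation in the
    # 75-300 Hz lag range (frame length 40 ms, hop 10 ms).
    frame, hop = int(0.040 * fs), int(0.010 * fs)
    lo, hi = int(fs / 300), int(fs / 75)
    periodicity = []
    for start in range(0, len(speech) - frame, hop):
        x = speech[start:start + frame]
        x = x - x.mean()
        ac = np.correlate(x, x, mode="full")[frame - 1:]
        ac0 = ac[0] if ac[0] > 0 else 1.0
        periodicity.append(ac[lo:hi].max() / ac0)
    return env_lp, onsets, np.asarray(periodicity)

# Example with white noise standing in for a 1 s speech signal at 16 kHz:
env, onsets, pitchiness = speech_time_courses(
    np.random.default_rng(0).standard_normal(16000), 16000)
```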
