Dynamical theories of speech processing propose that the auditory cortex parses acoustic information in parallel at the syllabic and phonemic timescales. We developed a paradigm to independently manipulate both linguistic timescales, and acquired intracranial recordings from 11 epilepsy patients listening to French sentences. Our results indicate that (i) syllabic and phonemic timescales are both reflected in the acoustic spectral flux; (ii) during comprehension, the auditory cortex tracks the syllabic timescale in the theta range, while neural activity in the alpha-beta range phase locks to the phonemic timescale; (iii) these neural dynamics occur simultaneously and share a joint spatial location; (iv) the spectral flux embeds two timescales (in the theta and low-beta ranges) across 17 natural languages.
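The abstract above reports that both linguistic timescales are reflected in the acoustic spectral flux. As an illustration only, here is a minimal sketch of one common spectral-flux definition (half-wave-rectified frame-to-frame change of the magnitude spectrum) and of reading off the dominant modulation rate from its spectrum; the window/hop sizes and the toy amplitude-modulated tone are assumptions, not the authors' pipeline.

```python
import numpy as np

def spectral_flux(x, sr, win=0.025, hop=0.010):
    """Half-wave-rectified frame-to-frame spectral change (one common definition)."""
    n, h = int(win * sr), int(hop * sr)
    w = np.hanning(n)
    frames = np.stack([x[i:i + n] * w for i in range(0, len(x) - n, h)])
    mags = np.abs(np.fft.rfft(frames, axis=1))    # magnitude spectrum per frame
    diff = np.diff(mags, axis=0)                  # change between adjacent frames
    return np.sum(np.maximum(diff, 0.0), axis=1)  # keep increases only

# Toy signal: a 1 kHz tone amplitude-modulated at 4 Hz (a syllable-like rate)
sr = 16000
t = np.arange(2 * sr) / sr
tone = (1 + np.sin(2 * np.pi * 4 * t)) * np.sin(2 * np.pi * 1000 * t)

flux = spectral_flux(tone, sr)                    # sampled at 1/hop = 100 Hz
spec = np.abs(np.fft.rfft(flux - flux.mean()))
freqs = np.fft.rfftfreq(len(flux), d=0.010)
peak_hz = freqs[int(np.argmax(spec))]             # dominant modulation rate, near 4 Hz
```

The same modulation-spectrum readout, applied to natural speech rather than a toy tone, is how a syllabic (theta) or phonemic (low-beta) timescale would show up as a peak in the spectral flux.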
Alpha oscillations in the auditory cortex have been associated with attention and the suppression of irrelevant information. However, their anatomical organization and interaction with other neural processes remain unclear. Do alpha oscillations function as a local mechanism within most neural sources to regulate their internal excitation/inhibition balance, or do they belong to separate inhibitory sources gating information across the auditory network? To address this question, we acquired intracerebral electrophysiological recordings from epilepsy patients during rest and while they listened to tones.
Background music is widely used to sustain attention, but little is known about what musical properties aid attention. This may be due to inter-individual variability in neural responses to music. Here we find that music with amplitude modulations added at specific rates can sustain attention differentially for those with varying levels of attentional difficulty.
What is the function of auditory hemispheric asymmetry? We propose that the identification of sound sources relies on the asymmetric processing of two complementary and perceptually relevant acoustic invariants: actions and objects. In a large dataset of environmental sounds, we observed that temporal and spectral modulations display only weak covariation. We then synthesized auditory stimuli by simulating various actions (frictions) occurring on different objects (solid surfaces).
To what extent do speech and music processing rely on domain-specific and domain-general neural networks? Using whole-brain intracranial EEG recordings in 18 epilepsy patients listening to natural, continuous speech or music, we investigated the presence of frequency-specific and network-level brain activity. We combined this with a statistical approach in which a clear operational distinction is made between shared, preferred, and domain-selective neural responses. We show that the majority of focal and network-level neural activity is shared between speech and music processing.
Timing and motor function share neural circuits and dynamics, which underpin their close and synergistic relationship. For instance, the temporal predictability of a sensory event optimizes motor responses to that event. Knowing when an event is likely to occur lowers response thresholds, leading to faster and more efficient motor behavior, though in situations of response conflict it can induce impulsive and inappropriate responding.
Speech comprehension is enhanced when preceded (or accompanied) by a congruent rhythmic prime reflecting the metrical sentence structure. Although these phenomena have been described for auditory and motor primes separately, their respective and synergistic contribution has not been addressed. In this experiment, participants performed a speech comprehension task on degraded speech signals that were preceded by a rhythmic prime that could be auditory, motor or audiomotor.
Speech and music are two fundamental modes of human communication. Lateralisation of key processes underlying their perception has been related both to the distinct sensitivity to low-level spectrotemporal acoustic features and to top-down attention. However, the interplay between bottom-up and top-down processes needs to be clarified.
Why do humans spontaneously dance to music? To test the hypothesis that motor dynamics reflect predictive timing during music listening, we created melodies with varying degrees of rhythmic predictability (syncopation) and asked participants to rate their wanting-to-move (groove) experience. Degree of syncopation and groove ratings are quadratically correlated. Magnetoencephalography data showed that, while auditory regions track the rhythm of melodies, beat-related 2-hertz activity and neural dynamics at delta (1.
Humans are experts at processing speech, but how this feat is accomplished remains a major question in cognitive neuroscience. Capitalizing on the concept of channel capacity, we developed a unified measurement framework to investigate the respective influence of seven acoustic and linguistic features on speech comprehension, encompassing acoustic, sub-lexical, lexical and supra-lexical levels of description. We show that comprehension is independently impacted by all these features, but at varying degrees and with a clear dominance of the syllabic rate.
Categorising voices is crucial for auditory-based social interactions. A recent study by Rupp and colleagues in PLOS Biology capitalises on human intracranial recordings to describe the spatiotemporal pattern of neural activity leading to voice-selective responses in associative auditory cortex.
Cortical oscillations have been proposed to play a functional role in speech and music perception, attentional selection, and working memory, via the mechanism of neural entrainment. One of the properties of neural entrainment that is often taken for granted is that its modulatory effect on ongoing oscillations outlasts rhythmic stimulation. We tested the existence of this phenomenon by studying cortical neural oscillations during and after presentation of melodic stimuli in a passive perception paradigm.
Objectives: Rhythmic body rocking movements may occur in prefrontal epileptic seizures. Here, we compare quantified time-evolving frequency of stereotyped rocking with signal analysis of intracerebral electroencephalographic data.
Methods: In a single patient, prefrontal seizures with rhythmic anteroposterior body rocking recorded on stereoelectroencephalography (SEEG) were analyzed using fast Fourier transform, time-frequency decomposition, and phase-amplitude coupling, with regard to quantified video data.
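The phase-amplitude coupling analysis named in the Methods can be illustrated with a minimal mean-vector-length estimate (in the style of Canolty and colleagues); the frequency bands, the 1.5 Hz "rocking-like" rate, and the toy signals below are illustrative assumptions, not the patient data.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase band-pass filter (second-order sections for stability)."""
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def pac_mvl(x, fs, phase_band=(1, 2), amp_band=(20, 30)):
    """Mean-vector-length phase-amplitude coupling estimate."""
    phase = np.angle(hilbert(bandpass(x, *phase_band, fs)))  # slow-band phase
    amp = np.abs(hilbert(bandpass(x, *amp_band, fs)))        # fast-band amplitude
    return np.abs(np.mean(amp * np.exp(1j * phase)))

# Toy signals at a rocking-like 1.5 Hz rate: in `coupled`, 25 Hz bursts occur
# only near the peak of the slow rhythm; in `uncoupled`, the 25 Hz activity is
# continuous, so its amplitude carries no slow-phase information.
fs = 500
t = np.arange(20 * fs) / fs
slow = np.sin(2 * np.pi * 1.5 * t)
coupled = slow + 0.5 * (slow > 0.5) * np.sin(2 * np.pi * 25 * t)
uncoupled = slow + 0.5 * np.sin(2 * np.pi * 25 * t)
```

A larger mean vector length for `coupled` than for `uncoupled` indicates that fast-band amplitude is concentrated at a preferred slow-band phase, which is what a phase-amplitude coupling analysis of ictal SEEG would quantify.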
Neural oscillations in auditory cortex are argued to support parsing and representing speech constituents at their corresponding temporal scales. Yet, how incoming sensory information interacts with ongoing spontaneous brain activity, what features of the neuronal microcircuitry underlie spontaneous and stimulus-evoked spectral fingerprints, and what these fingerprints entail for stimulus encoding, remain largely open questions. We used a combination of human invasive electrophysiology, computational modeling and decoding techniques to assess the information encoding properties of brain activity and to relate them to a plausible underlying neuronal microarchitecture.
Speech perception is mediated by both left and right auditory cortices but with differential sensitivity to specific acoustic information contained in the speech signal. A detailed description of this functional asymmetry is missing, and the underlying models are widely debated. We analyzed cortical responses from 96 epilepsy patients with electrode implantation in left or right primary, secondary, and/or association auditory cortex (AAC).
Does brain asymmetry for speech and music emerge from acoustical cues or from domain-specific neural networks? We selectively filtered temporal or spectral modulations in sung speech stimuli for which verbal and melodic content was crossed and balanced. Perception of speech decreased only with degradation of temporal information, whereas perception of melodies decreased only with spectral degradation. Functional magnetic resonance imaging data showed that the neural decoding of speech and melodies depends on activity patterns in left and right auditory regions, respectively.
That attention is a fundamentally rhythmic process has recently received abundant empirical evidence. The essence of temporal attention, however, is to flexibly focus in time. Whether this function is constrained by an underlying rhythmic neural mechanism is unknown.
When listening to temporally regular rhythms, most people are able to extract the beat. Evidence suggests that the neural mechanism underlying this ability is the phase alignment of endogenous oscillations to the external stimulus, allowing for the prediction of upcoming events (i.e.
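One standard way to quantify the phase alignment described above is inter-trial phase coherence (ITPC): if endogenous oscillations align to the stimulus, the band-limited phase at a given latency is consistent across trials. A minimal sketch under assumed toy parameters (a 2 Hz rhythm, 30 trials), not the study's actual analysis:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def itpc(trials, fs, band=(1, 3)):
    """Inter-trial phase coherence: |mean over trials of exp(i*phase)| per sample."""
    sos = butter(4, band, btype="band", fs=fs, output="sos")
    phase = np.angle(hilbert(sosfiltfilt(sos, trials, axis=1), axis=1))
    return np.abs(np.mean(np.exp(1j * phase), axis=0))

# Toy data: 30 trials of a noisy 2 Hz rhythm; `aligned` trials share the same
# phase at "stimulus onset", `jittered` trials have a random phase offset.
fs, n_trials = 250, 30
t = np.arange(2 * fs) / fs
rng = np.random.default_rng(0)
aligned = np.stack([np.sin(2 * np.pi * 2 * t)
                    + 0.5 * rng.standard_normal(len(t)) for _ in range(n_trials)])
jittered = np.stack([np.sin(2 * np.pi * 2 * t + rng.uniform(0, 2 * np.pi))
                     + 0.5 * rng.standard_normal(len(t)) for _ in range(n_trials)])
```

ITPC is bounded between 0 and 1; phase-aligned trials yield values near 1, while randomly jittered trials hover around the chance level expected from the trial count.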
In the motor cortex, beta oscillations (∼12-30 Hz) are generally considered a principal rhythm contributing to movement planning and execution. Beta oscillations cohabit and dynamically interact with slow delta oscillations (0.5-4 Hz), but the role of delta oscillations and the subordinate relationship between these rhythms in the perception-action loop remains unclear.
The ability to predict when something will happen facilitates sensory processing and the ensuing computations. Building on the observation that neural activity entrains to periodic stimulation, leading neurophysiological models imply that temporal predictions rely on oscillatory entrainment. Although they provide a sufficient solution to predict periodic regularities, these models are challenged by a series of findings that question their suitability to account for temporal predictions based on aperiodic regularities.
Anticipating the future rests upon our ability to exploit contextual cues and to formulate valid internal models or predictions. It is currently unknown how multiple predictions combine to bias perceptual information processing, and in particular whether this is determined by physiological constraints, behavioral relevance (task demands), or past knowledge (perceptual expertise). In a series of behavioral auditory experiments involving musical experts and non-musicians, we investigated the respective and combined contribution of temporal and spectral predictions in multiple detection tasks.
In behavior, action and perception are inherently interdependent. However, the actual mechanistic contributions of the motor system to sensory processing are unknown. We present neurophysiological evidence that the motor system is involved in predictive timing, a brain function that aligns temporal fluctuations of attention with the timing of events in a task-relevant stream, thus facilitating sensory selection and optimizing behavior.
Predicting not only what will happen, but also when it will happen is extremely helpful for optimizing perception and action. Temporal predictions driven by periodic stimulation increase perceptual sensitivity and reduce response latencies. At the neurophysiological level, a single mechanism has been proposed to mediate this twofold behavioral improvement: the rhythmic entrainment of slow cortical oscillations to the stimulation rate.
Neuronal oscillations consist of rhythmic fluctuations of excitability that are synchronized in ensembles of neurons and thus function as temporal filters that dynamically organize sensory processing. When perception relies on anticipatory mechanisms, ongoing oscillations also provide a neurophysiological substrate for temporal prediction. In this article, we review evidence for this account with a focus on auditory perception.