AI Article Synopsis

  • Neuroimaging studies show that the auditory cortex enhances tracking of an attended speaker in noisy environments, supporting speech comprehension.
  • In music listening, two modes—segregation and integration—allow listeners to focus on individual instruments or to appreciate multiple instruments simultaneously.
  • Researchers found that relevant instruments were better represented during a segregation task than during an integration task, suggesting that different processing strategies are at play in understanding multi-instrument music.

Article Abstract

Numerous neuroimaging studies have demonstrated that the auditory cortex tracks ongoing speech and that, in multi-speaker environments, tracking of the attended speaker is enhanced compared to other, irrelevant speakers. In contrast to speech, multi-instrument music can be appreciated by attending not only to its individual entities (i.e., segregation) but also to multiple instruments simultaneously (i.e., integration). We investigated the neural correlates of these two modes of music listening using electroencephalography (EEG) and sound envelope tracking. To this end, we presented uniquely composed music pieces played by two instruments, a bassoon and a cello, in combination with a previously validated music auditory scene analysis behavioral paradigm (Disbergen et al., 2018). Similar to results obtained through selective listening tasks for speech, relevant instruments could be reconstructed better than irrelevant ones during the segregation task. A delay-specific analysis showed higher reconstruction for the relevant instrument during a middle-latency window for both the bassoon and cello, and during a late window for the bassoon. During the integration task, we did not observe significant attentional modulation when reconstructing the overall music envelope. Subsequent analyses indicated that this null result might be due to the heterogeneous strategies listeners employ during the integration task. Overall, our results suggest that, subsequent to a common processing stage, top-down modulations consistently enhance the relevant instrument's representation during an instrument segregation task, whereas such an enhancement is not observed during an instrument integration task. These findings extend previous results from speech tracking to the tracking of multi-instrument music and, furthermore, inform current theories on polyphonic music perception.
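The abstract's "sound envelope tracking" refers to reconstructing the stimulus envelope from EEG, typically with a backward (decoding) model. The exact pipeline used in the study is not given here, so the following is only a minimal sketch of the general technique on synthetic data: ridge regression maps time-lagged EEG channels back to the sound envelope, and reconstruction quality is scored as the correlation between the reconstructed and true envelope on held-out data. All names and parameters (lag range, regularization strength, signal model) are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def lagged_design(eeg, max_lag):
    """Stack time-lagged copies of each EEG channel.

    Backward models reconstruct the stimulus at time t from EEG samples
    at t..t+max_lag, since cortical responses lag the stimulus.
    """
    n_samples, n_ch = eeg.shape
    X = np.zeros((n_samples, n_ch * (max_lag + 1)))
    for lag in range(max_lag + 1):
        X[:n_samples - lag, lag * n_ch:(lag + 1) * n_ch] = eeg[lag:]
    return X

def ridge_reconstruct(eeg_train, env_train, eeg_test, max_lag=10, alpha=1.0):
    """Fit a ridge decoder on training data, reconstruct the test envelope."""
    Xtr = lagged_design(eeg_train, max_lag)
    Xte = lagged_design(eeg_test, max_lag)
    # Closed-form ridge solution: w = (X'X + alpha*I)^{-1} X'y
    A = Xtr.T @ Xtr + alpha * np.eye(Xtr.shape[1])
    w = np.linalg.solve(A, Xtr.T @ env_train)
    return Xte @ w

# Synthetic example: a smooth "envelope" drives 8 EEG channels,
# each a delayed, noisy copy of the envelope.
rng = np.random.default_rng(0)
n_samples, n_ch = 2000, 8
env = np.convolve(rng.standard_normal(n_samples), np.ones(20) / 20, mode="same")
eeg = np.stack(
    [np.roll(env, d) + 0.5 * rng.standard_normal(n_samples) for d in range(n_ch)],
    axis=1,
)

rec = ridge_reconstruct(eeg[:1500], env[:1500], eeg[1500:])
r = np.corrcoef(rec, env[1500:])[0, 1]  # reconstruction accuracy
```

In attention analyses like the one described, the decoder is trained per condition and the reconstruction correlation `r` is compared between the attended (relevant) and unattended (irrelevant) instrument's envelope.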

Source

PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8498193
DOI: http://dx.doi.org/10.3389/fnins.2021.635937

Publication Analysis

Top Keywords

integration task (12)
music (8)
polyphonic music (8)
multi-instrument music (8)
bassoon cello (8)
segregation task (8)
window bassoon (8)
integration (5)
task (5)
modulating cortical (4)
