Emotionally congruent music and text increase immersion and appraisal.

PLoS One

Department of General Experimental Psychology, Johannes Gutenberg-Universität Mainz, Mainz, Germany.

Published: January 2023

Numerous studies indicate that listening to music and reading are processes that interact in multiple ways. However, these interactions have rarely been explored with regard to the role of emotional mood. In this study, we first conducted two pilot experiments to assess the emotional mood conveyed by four classical music pieces and by four narrative text excerpts. In the main experiment, participants read the texts while listening to the music and rated their emotional state in terms of valence, arousal, and dominance. Subsequently, they rated the text and the music of the multisensory event in terms of perceived mood, liking, immersion, and music-text fit. We found a mutual carry-over effect of happy and sad moods from music to text and vice versa. Against our expectations, this effect was not mediated by the valence, arousal, or dominance participants experienced. Moreover, we found a significant interaction between music mood and text mood: when the two corresponded, texts were liked better, rated as higher in quality, and experienced as more immersive. The role of mood congruence when listening to music while reading should not be ignored and deserves further exploration.
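The congruence effect reported here corresponds statistically to a music mood × text mood interaction on ratings such as liking and immersion. As a hedged illustration of how such an interaction can be tested, here is a minimal sketch in Python; the data frame, column names, and effect sizes are invented for illustration and are not the authors' actual data or analysis pipeline.

```python
# Sketch: testing a music-mood x text-mood interaction on liking ratings.
# Data, column names, and effect sizes are hypothetical; the paper's
# actual variables and analysis may differ.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_per_cell = 20

# Toy data: congruent music-text pairings (happy/happy, sad/sad) are rated
# about one point higher than incongruent ones.
rows = []
for music in ("happy", "sad"):
    for text in ("happy", "sad"):
        base = 5.0 if music == text else 4.0
        for _ in range(n_per_cell):
            rows.append((music, text, base + rng.normal(0, 0.5)))
data = pd.DataFrame(rows, columns=["music_mood", "text_mood", "liking"])

# Full factorial OLS: C(a) * C(b) expands to both main effects plus the
# interaction term, which is what carries the congruence effect.
model = smf.ols("liking ~ C(music_mood) * C(text_mood)", data=data).fit()
print(model.summary())
```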

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9836297 (PMC)
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0280019 (PLOS)

Publication Analysis

Top Keywords

listening music: 12
music: 9
music text: 8
music reading: 8
mood: 8
emotional mood: 8
valence arousal: 8
arousal dominance: 8
music mood: 8
text mood: 8

Similar Publications

Evidence for a shared cognitive mechanism underlying relative rhythmic and melodic perception.

Front Psychol

January 2025

Department of Behavioral and Cognitive Biology, Vienna CogSciHub, University of Vienna, Vienna, Austria.

Musical melodies and rhythms are typically perceived in a relative manner: two melodies are considered "the same" even if one is shifted up or down in frequency, as long as the relationships among the notes are preserved. Similar principles apply to rhythms, which can be slowed down or sped up proportionally in time and still be considered the same pattern. We investigated whether humans perceiving rhythms and melodies may rely upon the same or similar mechanisms to achieve this relative perception.
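The relative perception described above can be made concrete: a melody can be represented by its successive intervals, which survive transposition, and a rhythm by its inter-onset ratios, which survive proportional tempo changes. A minimal sketch of these invariances, assuming simple MIDI note numbers and onset times (illustrative representations, not the study's stimuli or methods):

```python
# Sketch: transposition-invariant and tempo-invariant representations.
# Illustrative only; not the representations used in the study.

def intervals(midi_notes):
    """Successive semitone intervals: unchanged by transposition."""
    return [b - a for a, b in zip(midi_notes, midi_notes[1:])]

def duration_ratios(onsets_s):
    """Successive inter-onset-interval ratios: unchanged by tempo scaling."""
    iois = [b - a for a, b in zip(onsets_s, onsets_s[1:])]
    return [round(b / a, 6) for a, b in zip(iois, iois[1:])]

melody = [60, 62, 64, 65]           # C-D-E-F as MIDI note numbers
shifted = [n + 7 for n in melody]   # same melody, a fifth higher
assert intervals(melody) == intervals(shifted)

rhythm = [0.0, 0.5, 1.5, 2.0]       # onset times in seconds
faster = [t / 2 for t in rhythm]    # same pattern, twice as fast
assert duration_ratios(rhythm) == duration_ratios(faster)
```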


Pitch perception in school-aged children: Pure tones, resolved and unresolved harmonics.

JASA Express Lett

January 2025

Department of Otolaryngology-Head and Neck Surgery, University of Washington, Seattle, Washington 98103, USA.

Pitch perception affects children's ability to perceive speech, appreciate music, and learn in noisy environments, such as their classrooms. Here, we investigated pitch perception for pure tones as well as resolved and unresolved complex tones with a fundamental frequency of 400 Hz in 8- to 11-year-old children and adults. Pitch perception in children was better for resolved relative to unresolved complex tones, consistent with adults.
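Whether a harmonic is resolved depends on whether it falls into its own cochlear filter; low-numbered harmonics of a 400 Hz fundamental are typically resolved, while high-numbered ones are not. A minimal synthesis sketch of the two stimulus types, assuming equal-amplitude partials and illustrative harmonic ranges (not the study's exact stimuli):

```python
# Sketch: complex tones with low (resolved) vs high (unresolved) harmonics
# of a 400 Hz fundamental. Parameters are illustrative, not the study's.
import numpy as np

FS = 44100    # sample rate (Hz)
F0 = 400.0    # fundamental frequency (Hz)

def harmonic_complex(f0, harmonics, dur=0.5, fs=FS):
    """Sum of equal-amplitude cosine partials at the given harmonic numbers."""
    t = np.arange(int(dur * fs)) / fs
    tone = sum(np.cos(2 * np.pi * h * f0 * t) for h in harmonics)
    return tone / len(list(harmonics))  # normalize by partial count

resolved = harmonic_complex(F0, range(2, 7))      # harmonics 2-6
unresolved = harmonic_complex(F0, range(13, 19))  # harmonics 13-18
```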


Background: A binaural beat is created by presenting two pure-tone sine waves that differ by less than 30 Hz dichotically (one to each ear). In dental settings, children who listen to familiar music during treatment gain control over the anxiety caused by instruments such as the airotor or syringe, creating a comforting, familiar environment.

Aim: To evaluate and compare anxiety levels during restorative treatment using No Music, Music of Choice, and Binaural Auditory Beats as audio-distraction behaviour-guidance techniques in children aged 6-12 years.

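The Background's definition translates directly into synthesis: two pure tones a few hertz apart, one per ear, yield a perceived beat at the difference frequency. A minimal stereo sketch, assuming an illustrative 6 Hz offset (the study's actual stimulus parameters are not given here):

```python
# Sketch: a binaural-beat stimulus as a stereo signal, one pure tone per ear.
# The 6 Hz offset is illustrative; the study's parameters may differ.
import numpy as np

FS = 44100                        # sample rate (Hz)
DUR = 5.0                         # duration (s)
F_LEFT, F_RIGHT = 200.0, 206.0    # 6 Hz difference -> 6 Hz perceived beat

t = np.arange(int(DUR * FS)) / FS
left = np.sin(2 * np.pi * F_LEFT * t)
right = np.sin(2 * np.pi * F_RIGHT * t)
stereo = np.stack([left, right], axis=1)  # shape (samples, 2): one ear per column
```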

Beta oscillations predict the envelope sharpness in a rhythmic beat sequence.

Sci Rep

January 2025

RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo, Forskningsveien 3A, Oslo, 0373, Norway.

Periodic sensory inputs entrain oscillatory brain activity, reflecting a neural mechanism that might be fundamental to temporal prediction and perception. Most environmental rhythms and patterns in human behavior, such as walking, dancing, and speech, do not, however, display strict isochrony but are instead quasi-periodic. Research has shown that neural tracking of speech is driven by modulations of the amplitude envelope, especially via sharp acoustic edges, which serve as prominent temporal landmarks.
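Envelope sharpness, the acoustic edges mentioned above, is commonly operationalized as the rate of change of the amplitude envelope extracted via the Hilbert transform. A minimal sketch of that standard approach on a toy amplitude-modulated beat sequence (not necessarily the paper's exact preprocessing pipeline):

```python
# Sketch: amplitude envelope and its "sharpness" (rate of rise) for a beat
# sequence, using the standard Hilbert-envelope approach. The paper's exact
# preprocessing may differ.
import numpy as np
from scipy.signal import hilbert

FS = 8000                          # sample rate (Hz)
t = np.arange(int(2.0 * FS)) / FS

# Toy beat sequence: a 440 Hz tone amplitude-modulated at 2 Hz (4 beats in 2 s).
carrier = np.sin(2 * np.pi * 440.0 * t)
am = 0.5 * (1 + np.sin(2 * np.pi * 2.0 * t - np.pi / 2))
signal = am * carrier

envelope = np.abs(hilbert(signal))         # analytic-signal magnitude
sharpness = np.gradient(envelope, 1 / FS)  # first derivative of the envelope
print("peak envelope slope:", sharpness.max())
```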


Auditory perception requires categorizing sound sequences, such as speech or music, into classes, such as syllables or notes. Auditory categorization depends not only on the acoustic waveform but also on variability and uncertainty in how the listener perceives the sound, including sensory and stimulus uncertainty, the listener's estimated relevance of the particular sound to the task, and their ability to learn the past statistics of the acoustic environment. Whereas these factors have been studied in isolation, whether and how they interact to shape categorization remains unknown.

