Schöner and Thelen (2006) summarized the results of many habituation studies as a set of generalizations about the emergence of novelty preferences in infancy. One is that novelty preferences emerge after fewer trials for older than for younger infants. Yet in habituation studies using an infant-controlled procedure, the standard criterion of habituation is a 50% decrement in looking regardless of the ages of the participants. If younger infants require more looking to habituate than do older infants, it might follow that novelty preferences will emerge for younger infants when a more stringent criterion is imposed, e.g., a 70% decrement in looking. Our earlier investigation of infants' discrimination of musical excerpts provides a basis and an opportunity for assessing this idea. Flom et al. (2008) found that 9-month-olds, but not younger infants, unambiguously discriminate "happy" and "sad" musical excerpts. The purpose of the current study was to examine younger infants' discrimination of happy and sad musical excerpts using a more stringent, 70% habituation criterion. In Experiment 1, 5- and 7-month-olds were habituated to three musical excerpts rated as happy or sad. Following habituation, infants were presented with two musical excerpts from the other affect group. Infants at both ages showed significant discrimination. In Experiment 2, 5- and 7-month-olds were presented with two new excerpts from the same affective group as the habituation excerpts. The infants did not discriminate these novel, yet affectively similar, excerpts. In Experiment 3, 5- and 7-month-olds discriminated individual happy and sad excerpts. These results replicate those obtained with the older, 9-month-old infants in the previous investigation. The results are important because they demonstrate that whether infants show discrimination in an infant-controlled procedure depends on the researchers' chosen criterion of habituation.
DOI: http://dx.doi.org/10.1016/j.infbeh.2012.07.022
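The criterion manipulation at the heart of the abstract is easy to make concrete. The sketch below is not the authors' procedure; it assumes the common sliding-window formulation of infant-controlled habituation, in which mean looking over the most recent trials is compared against mean looking over the first trials, and all names (habituation_trial, looks) are illustrative only.

```python
# Minimal sketch (not the authors' code): when does an infant meet an
# infant-controlled habituation criterion? Assumes the common sliding-window
# rule: habituation is declared once mean looking over the last `window`
# trials falls to (1 - criterion) of the mean over the first `window` trials.

def habituation_trial(looking_times, criterion=0.5, window=3):
    """Return the 1-based trial on which the criterion is met, or None.

    looking_times : per-trial looking durations (seconds)
    criterion     : required decrement, e.g. 0.5 (50%) or 0.7 (70%)
    window        : number of consecutive trials averaged at each end
    """
    if len(looking_times) < 2 * window:
        return None
    baseline = sum(looking_times[:window]) / window
    for end in range(2 * window, len(looking_times) + 1):
        recent = sum(looking_times[end - window:end]) / window
        if recent <= (1 - criterion) * baseline:
            return end  # criterion met on this trial
    return None

# Illustrative looking-time series (seconds per trial)
looks = [40, 38, 36, 30, 22, 18, 14, 10, 9]
print(habituation_trial(looks, criterion=0.5))  # met on trial 7
print(habituation_trial(looks, criterion=0.7))  # met later, on trial 9
```

With this rule, the same looking-time series can satisfy a 50% decrement several trials before it satisfies a 70% decrement, which is why the chosen criterion can determine whether habituation, and hence a novelty preference, is observed within a session.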
Music Sci, December 2024. Schulich School of Music, McGill University, Montreal, QC, Canada.
Timbre has been identified as a potential component in the communication of affect in music. Although its function as a carrier of perceptually useful information about sound source mechanics has been established, less is understood about whether and how it functions as a carrier of information for communicating affect in music. To investigate these issues, listeners trained in Chinese and Western musical traditions were presented with Phrases, Measures, and Notes of recorded excerpts interpreted with a variety of affective intentions by performers on instruments from the two cultures.
Sci Rep, November 2024. School of Psychological Sciences, Macquarie University, Sydney, Australia.
Humans perceive a range of basic emotional connotations from music, such as joy, sadness, and fear, which can be decoded from structural characteristics of music, such as rhythm, harmony, and timbre. However, despite theory and evidence that music has multiple social functions, little research has examined whether music conveys emotions specifically associated with social status and social connection. This investigation aimed to determine whether the social emotions of dominance and affiliation are perceived in music and whether structural features of music predict social emotions, just as they predict basic emotions.
Neuroscience, December 2024. Graduate Institute of Musicology, National Taiwan University, Taipei, Taiwan; Graduate Institute of Brain and Mind Sciences, National Taiwan University, Taipei, Taiwan.
In pop music, drum and bass components are crucial for generating the desire to move one's body, primarily due to their role in delivering salient metrical cues. This study explored how the presence of drum and bass influences neural responses to unfamiliar pop songs. Using AI-based algorithms, we isolated the drum and bass components from the musical excerpts, creating two additional versions: one that included only the drum and bass (excluding vocals and other instruments), and another that excluded the drum and bass (consisting solely of vocals and other instruments).
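The two additional versions described in this abstract (drum and bass only, and everything except drum and bass) amount to summing different subsets of separated stems. Below is a minimal sketch under stated assumptions: a prior source-separation step has already produced aligned, equal-length stem files, and the file names are hypothetical, not the authors' pipeline.

```python
# Minimal sketch: remix separated stems into the two conditions described
# above. Assumes a source-separation step has written aligned, equal-length
# stems to disk; the file names below are hypothetical.
import soundfile as sf  # pip install soundfile

drums, sr = sf.read("stems/drums.wav")
bass, _ = sf.read("stems/bass.wav")
vocals, _ = sf.read("stems/vocals.wav")
other, _ = sf.read("stems/other.wav")

# Version 1: drum and bass only (vocals and other instruments excluded)
sf.write("drum_bass_only.wav", drums + bass, sr)

# Version 2: drum and bass excluded (vocals and other instruments only)
sf.write("no_drum_bass.wav", vocals + other, sr)
```

Summed stems can exceed full scale, so a real pipeline would normalize or attenuate before writing the mixed files.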
Audit Percept Cogn, February 2024. Vanderbilt Brain Institute, Vanderbilt University, Nashville, TN, USA.
Introduction: The speech-to-song illusion is a robust effect where repeated speech induces the perception of singing; this effect has been extended to repeated excerpts of environmental sounds (sound-to-music effect). Here we asked whether repetition could elicit musical percepts in cochlear implant (CI) users, who experience challenges with perceiving music due to both physiological and device limitations.
Methods: Thirty adult CI users and thirty age-matched controls with normal hearing (NH) completed two repetition experiments for speech and nonspeech sounds (water droplets).
J Acoust Soc Am, October 2024. ENTPE, Ecole Centrale de Lyon, CNRS, LTDS, UMR5513, 69518 Vaulx-en-Velin, France.
Many objective measurements have been proposed to evaluate sound reproduction, but it is often difficult to link measured differences with the differences perceived by listeners. In the literature, the best correlations with perception were obtained for measures involving an auditory model. The present study investigated simpler measurements to highlight the signal processing steps required to make the link with perception.