Conclusion: While objective testing of music perception showed no differences between the two cochlear implant (CI) devices within individuals, subjective music perception was judged superior with the MED-EL device in the majority of cases evaluated.
Objective: To compare speech and music perception through two different CI systems in the same individuals.
Methods: Six post-lingually deaf patients, who had been implanted with a Cochlear™ Nucleus® device in one ear and a MED-EL SONATA TI100 on the contralateral side, were evaluated. One subject was excluded from group analysis because of significant differences in performance between ears. Subjects completed a questionnaire designed to assess implant users' listening habits. Subjective assessments compared speech and music perception with each system and recorded each subject's system preference. The subjects then used each system in turn, with the contralateral device turned off, and were objectively assessed on specific musical skills. Speech perception was tested in quiet and in noise.
Results: For all objective tests of music discrimination and speech perception in noise, there were no statistically significant differences between MED-EL and Cochlear CI systems. Subjectively, four subjects thought their MED-EL device was better than their Cochlear device for music appreciation. Four thought that music sounded more natural, less tinny and more reverberant with their MED-EL CI than with their Cochlear CI. One subject rated all these to be equal.
DOI: http://dx.doi.org/10.3109/00016489.2011.616225
Front Psychol
January 2025
Department of Behavioral and Cognitive Biology, Vienna CogSciHub, University of Vienna, Vienna, Austria.
Musical melodies and rhythms are typically perceived in a relative manner: two melodies are considered "the same" even if one is shifted up or down in frequency, as long as the relationships among the notes are preserved. Similar principles apply to rhythms, which can be slowed down or sped up proportionally in time and still be considered the same pattern. We investigated whether humans perceiving rhythms and melodies may rely upon the same or similar mechanisms to achieve this relative perception.
Pain
January 2025
Department of Psychology, McGill University, Montreal, Canada.
Music has long been recognized as a noninvasive and cost-effective means of reducing pain. However, the selection of music for pain relief often relies on intuition rather than on a scientific understanding of the impact of basic musical attributes on pain perception. This study examines how a fundamental element of music-tempo-affects its pain-relieving properties.
JASA Express Lett
January 2025
Department of Otolaryngology-Head and Neck Surgery, University of Washington, Seattle, Washington 98103, USA.
Pitch perception affects children's ability to perceive speech, appreciate music, and learn in noisy environments, such as their classrooms. Here, we investigated pitch perception for pure tones as well as for resolved and unresolved complex tones with a fundamental frequency of 400 Hz in 8- to 11-year-old children and adults. Pitch perception in children was better for resolved than for unresolved complex tones, a pattern consistent with that seen in adults.
Sci Rep
January 2025
RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, University of Oslo, Forskningsveien 3A, Oslo, 0373, Norway.
Periodic sensory inputs entrain oscillatory brain activity, reflecting a neural mechanism that may be fundamental to temporal prediction and perception. Most environmental rhythms and patterns in human behavior, such as walking, dancing, and speech, do not, however, display strict isochrony but are instead quasi-periodic. Research has shown that neural tracking of speech is driven by modulations of the amplitude envelope, especially via sharp acoustic edges, which serve as prominent temporal landmarks.
Sci Rep
January 2025
Department of Otorhinolaryngology, University of Pennsylvania, Philadelphia, PA, USA.
Auditory perception requires categorizing sound sequences, such as speech or music, into classes, such as syllables or notes. Auditory categorization depends not only on the acoustic waveform, but also on variability and uncertainty in how the listener perceives the sound, including sensory and stimulus uncertainty, the listener's estimated relevance of the particular sound to the task, and their ability to learn the past statistics of the acoustic environment. Whereas these factors have been studied in isolation, whether and how they interact to shape categorization remains unknown.