Purpose: Protoporphyrin IX (PpIX) fluorescence allows discrimination between tumor and normal brain tissue during neurosurgery. A handheld fluorescence (HHF) probe can be used for spectroscopic measurement of 5-ALA-induced PpIX, enabling objective detection compared with visual evaluation of fluorescence. However, current technology requires the surgeon either to view the measured values on a screen or to have an assistant verbally relay them. An auditory feedback system was developed and evaluated for communicating measured fluorescence intensity values directly to the surgeon.
Methods: The auditory display was programmed to map the values measured by the HHF probe to the playback of tones representing three fluorescence intensity ranges and one error signal. Ten participants with no previous knowledge of the application took part in a laboratory evaluation. After a brief training period, they performed measurements on a 96-well tray of liquid fluorescence phantoms and verbally stated the perceived measurement value for each well. The latency and accuracy of the verbal responses were recorded. Long-term retention of the tone meanings was evaluated in a second set of 10 participants 2-3 days and 7-12 days after training.
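To illustrate the kind of categorical mapping described in the Methods, the sketch below assigns a measured PpIX intensity value to one of the three intensity tones or to the error signal. This is a minimal illustration only, not the published implementation: the threshold values, tone frequencies, and function names are hypothetical, since the abstract does not specify them.

```python
# Minimal sketch (not the authors' implementation) of mapping a measured PpIX
# fluorescence intensity to one of three tone categories or an error signal.
# Thresholds and tone frequencies below are hypothetical placeholders.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Tone:
    label: str
    frequency_hz: float  # pitch used for the auditory cue


# Hypothetical category boundaries (arbitrary PpIX intensity units).
LOW_THRESHOLD = 0.025
HIGH_THRESHOLD = 0.5

TONES = {
    "none/low": Tone("none/low", 220.0),
    "medium": Tone("medium", 440.0),
    "high": Tone("high", 880.0),
    "error": Tone("error", 110.0),
}


def map_intensity_to_tone(intensity: Optional[float]) -> Tone:
    """Return the tone for a measurement, or the error tone for invalid input."""
    if intensity is None or intensity < 0:
        return TONES["error"]  # failed or invalid measurement
    if intensity < LOW_THRESHOLD:
        return TONES["none/low"]
    if intensity < HIGH_THRESHOLD:
        return TONES["medium"]
    return TONES["high"]


if __name__ == "__main__":
    for value in (None, 0.01, 0.1, 0.8):
        tone = map_intensity_to_tone(value)
        print(f"measurement={value} -> tone '{tone.label}' at {tone.frequency_hz} Hz")
```

In a real system the selected tone would be rendered as a repeating audio pulse rather than printed, but the categorical decision logic would take this general form.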
Results: Participants identified the played tone correctly for 98% of measurements after training. The median response time to verbally identify a played tone was 2 pulses. No correlation was found between response latency and accuracy, and the participants' musical proficiency showed no significant correlation with their responses. Responses in the memory test were 100% accurate.
Conclusion: The auditory display proved intuitive, easy to learn and remember, fast to recognize, and accurate in conveying fluorescence intensity measurements or an error signal to users. These results establish a basis for implementing and further evaluating auditory displays in clinical scenarios involving fluorescence guidance, and in other areas where a categorized auditory display could be useful.
Full-text sources:
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5772873
DOI: http://dx.doi.org/10.1007/s11548-017-1667-5