Perception with electric neuroprostheses can, in principle, be simulated using properly designed physical stimuli. Here, we examined a new acoustic vocoder model for electric hearing with cochlear implants (CIs) and hypothesized that comparable speech encoding can lead to comparable perceptual patterns for CI and normal-hearing (NH) listeners. Speech signals were encoded using FFT-based signal processing stages, including band-pass filtering, temporal envelope extraction, maxima selection, and amplitude compression and quantization. These stages were implemented in the same manner in both the Advanced Combination Encoder (ACE) strategy used in CI processors and the Gaussian-Enveloped Tones (GET) or Gaussian-Enveloped Noise (GEN) vocoders used for NH. Adaptive speech reception thresholds (SRTs) in noise were measured using four Mandarin sentence corpora. Initial consonant (11 monosyllables) and final vowel (20 monosyllables) recognition were also measured. Naïve NH listeners were tested using vocoded speech with the proposed GET/GEN vocoders as well as conventional vocoders (controls). Experienced CI listeners were tested using their daily-used processors. Results showed that: 1) there was a significant training effect on GET-vocoded speech perception; 2) the GEN-vocoded scores (SRTs with four corpora and consonant and vowel recognition scores), as well as the phoneme-level confusion pattern, matched the CI scores better than the controls did. The findings suggest that the same signal encoding implementations may lead to similar perceptual patterns simultaneously across multiple perception tasks. This study highlights the importance of faithfully replicating all signal processing stages when modeling perceptual patterns in sensory neuroprostheses. This approach has the potential to enhance our understanding of CI perception and accelerate the engineering of prosthetic interventions. The GET/GEN MATLAB program is freely available at https://github.com/BetterCI/GETVocoder.
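The processing stages named in the abstract (band-pass filtering, temporal envelope extraction, n-of-m maxima selection, and amplitude compression) can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' released GET/GEN code: the Butterworth filter bank, band count, maxima count, and logarithmic compression law are all assumed stand-ins for the paper's FFT-based implementation.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def vocoder_envelopes(x, fs, n_bands=22, n_maxima=8, fmin=100.0, fmax=7000.0):
    """Illustrative ACE-like front end: band-pass filter bank, per-band
    temporal envelopes, n-of-m maxima selection, log compression.
    All parameter values are assumptions, not the paper's settings."""
    # Log-spaced band edges as a rough stand-in for the FFT filter bank.
    edges = np.geomspace(fmin, fmax, n_bands + 1)
    envs = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, x)
        envs.append(np.abs(hilbert(band)))   # temporal envelope of this band
    envs = np.array(envs)                    # shape: (n_bands, n_samples)
    # n-of-m maxima selection: per sample, keep only the n_maxima largest bands.
    thresh = -np.sort(-envs, axis=0)[n_maxima - 1]
    envs = np.where(envs >= thresh, envs, 0.0)
    # Logarithmic amplitude compression into [0, 1) (illustrative law).
    peak = envs.max() + 1e-12
    return np.log1p(100 * envs / peak) / np.log1p(100)

# Toy two-tone input; each compressed envelope channel would then modulate
# a Gaussian-enveloped tone (GET) or noise (GEN) carrier in its band.
fs = 16000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 2000 * t)
E = vocoder_envelopes(x, fs)
```

The maxima-selection step is what makes this an n-of-m strategy: at every instant only the `n_maxima` most energetic of the `n_bands` channels survive, mirroring how ACE selects spectral maxima before stimulation.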
DOI: http://dx.doi.org/10.1109/TNSRE.2023.3274604
Proc Natl Acad Sci U S A
January 2025
Section on Perception, Cognition, Action, Laboratory of Sensorimotor Research, National Eye Institute, NIH, Bethesda, MD 20892.
To what extent does concept formation require language? Here, we exploit color to address this question and ask whether macaque monkeys have color concepts evident as categories. Macaques have cone photoreceptors and central visual circuits similar to those of humans, yet they lack language. Whether Old World monkeys such as macaques have consensus color categories is unresolved, but if they do, then language cannot be required.
PLoS One
January 2025
Department of Preventive Medicine, School of Public Health, Addis Ababa University, Addis Ababa, Ethiopia.
Background: Despite the rising prevalence of common mental symptoms, little is known about how health workers make sense of symptoms of mental disorders, or about how they perceive inadequate water, sanitation, and hygiene (WASH) as a work stressor, knowledge needed to understand causation and to inform policy and practice. Therefore, this study aimed to explore how health workers perceive the link between inadequate WASH and common mental symptoms (CMSs) at hospitals in the central and southern Ethiopian regions.
Methods: We used an interpretive and descriptive phenomenological design guided by theoretical frameworks.
J Neuropsychiatry Clin Neurosci
January 2025
Department of Psychology, Chung Shan Medical University, and Clinical Psychological Room, Chung Shan Medical University Hospital, Taichung, Taiwan (Huang); Department of Psychology, Fo Guang University, Yilan, Taiwan (Chen); Come a New Halfway House, Taoyuan, Taiwan (Wang); Department of Psychiatry, National Cheng Kung University Hospital (Kuo, Yang, Tseng), and Institute of Behavioral Medicine (Yang, Tseng), College of Medicine, National Cheng Kung University, Tainan, Taiwan.
Objective: Social cognition is defined as the ability to construct mental representations about oneself, others, and one's relationships with others to guide social behaviors, including referring to mental states (cognitive factor) and understanding emotional states (affective factor). Difficulties in social cognition may be symptoms of schizophrenia. The authors examined associations between two factors of social cognition and specific schizophrenia symptoms, as well as a potential path from low-level affective perceptual social cognition to high-level social cognition, which may be associated with schizophrenia symptoms.
J Neurosci
January 2025
Department of Physiology, Anatomy and Genetics, University of Oxford.
Limits on information processing capacity impose limits on task performance. We show that male and female mice achieve performance on a perceptual decision task that is near-optimal given their capacity limits, as measured by policy complexity (the mutual information between states and actions). This behavioral profile could be achieved by reinforcement learning with a penalty on high complexity policies, realized through modulation of dopaminergic learning signals.
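The measure named in this abstract, policy complexity, is the mutual information I(S; A) between states and actions, which can be computed directly from a joint state-action distribution. The two example policies below are hypothetical illustrations, not data from the study.

```python
import numpy as np

def policy_complexity(joint):
    """Policy complexity as mutual information I(S; A) in bits,
    computed from a joint P(state, action) table."""
    joint = joint / joint.sum()                  # normalize to a distribution
    ps = joint.sum(axis=1, keepdims=True)        # marginal over states
    pa = joint.sum(axis=0, keepdims=True)        # marginal over actions
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log2(joint / (ps * pa))
    return float(np.nansum(terms))               # zero-probability cells contribute 0

# A deterministic policy (each state maps to its own action) has the maximal
# complexity log2(n_states) bits; a state-independent policy has 0 bits.
det = np.eye(2) / 2            # 2 states, 2 actions, one action per state
flat = np.full((2, 2), 0.25)   # same action distribution in every state
```

A penalty on this quantity during reinforcement learning, as the abstract describes, pushes the agent toward cheaper (more state-independent) policies at some cost in reward.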
Neuropsychologia
January 2025
Department of Criminology & Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat Gan 5290002, Israel; Department of Neuroscience and Biomedical Engineering, Aalto University, Finland 00076.
While individuals often self-report decreasing negative attitudes toward outgroups, biased behaviour prevails. This gap between words and actions may stem from unobtrusive mental processes that could be uncovered by using neuroimaging in addition to self-reports. In this study, we investigated whether adding neuroimaging to a traditional intergroup bias measure could detect intersubject differences in intergroup bias processes in a societal context where opposing discrimination is normative.