Publications by authors named "Jordi Navarra"

Article Synopsis
  • The study found that individuals with Parkinson's disease (IwPD), despite having measurable voice issues such as low loudness (hypophonia), still scored within the healthy range on the Voice Symptom Scale (VoiSS), indicating a disconnect between self-perception and actual vocal ability.
  • Results suggest a need for new self-report tools designed specifically for IwPD, as current questionnaires do not effectively capture their characteristic voice difficulties.
Article Synopsis
  • Not all patients undergoing chemotherapy experience chemosensory changes in the same way; individuals vary in how they perceive and cope with these side effects.
  • The review calls for more research on these individual differences and suggests strategies to enhance patients' chemosensory perception of food, aiming at a personalized approach to malnutrition during and after treatment.
Article Synopsis
  • The brain can adjust how we perceive the timing of sensory signals based on their associations with one another.
  • In an experiment, participants were exposed to a circle that was always followed by a tone and a triangle that appeared at random times.
  • Results showed that after 5 minutes, participants were better at judging the timing of the circle and tone together, but not for the triangle, suggesting that learned associations influence our perception of time even when we're not consciously aware of them.
Article Synopsis
  • Current neuropsychological assessments miss subtle motor deficits in cognitively normal individuals who have amyloid-β positivity, indicating a need for improved measurements.
  • The study involved 72 right-handed participants categorized by their cerebrospinal fluid (CSF) biomarker profiles, including controls and those with amyloid-β positivity, using a modified Finger Tapping Task to evaluate tapping speed and variability.
  • Results showed significant differences in tapping speed and variability between groups, with those positive for amyloid-β displaying slower speeds and more variability, suggesting that these motor difficulties might be early indicators of dementia risk related to Alzheimer's disease.
Article Synopsis
  • Musical melodies have distinct "peaks" and "valleys," which are related to pitch but are not fully understood in terms of mental representation.
  • The study explored how past experiences with melodies influence crossmodal interactions, affecting perception and attention.
  • Results showed that congruent visual stimuli (aligned with melody predictions) led to quicker responses, while incongruent stimuli (which violated predictions) caused a stronger 'surprise' response, indicating that repeated exposure to melodies shapes how we interpret other sensory information.
Article Synopsis
  • Higher frequency and louder sounds are connected to higher spatial positions, while lower frequency and quieter sounds relate to lower positions, with different language terms used for these associations in English versus Catalan/Spanish.
  • English uses "high" and "low" for both pitch and loudness, while Catalan/Spanish have distinct words for pitch and loudness.
  • A study assessing how language affects sound perception showed that both English and Spanish/Catalan speakers recognized these sound associations, but English speakers were more influenced by pitch, indicating that linguistic background shapes perception.
Article Synopsis
  • Phenylketonuria (PKU) is a rare metabolic disease that leads to neurological symptoms and slow performance in tasks requiring motor coordination, which is linked to phenylalanine (Phe) levels.
  • A study compared the visuomotor task performance of early-treated PKU patients aged 11 to 25 with healthy controls, measuring response speed and practice effects related to movement.
  • Results showed PKU patients had slower responses and did not benefit from practice, with correlations indicating that higher Phe levels and performance on executive function tasks may contribute to diminished motor control and learning capabilities.
Article Synopsis
  • Individuals with preclinical Alzheimer's disease (Pre-AD) may show subtle cognitive difficulties, even when standard tests indicate normal performance; this study aimed to detect these issues using a new visuomotor coordination task (VMC).
  • The VMC task revealed that Pre-AD participants had slower response times compared to cognitively normal controls, indicating early visuomotor difficulties, which were also related to Alzheimer's biomarkers and subjective cognitive decline.
  • The findings suggest that the VMC task could serve as an effective tool for distinguishing Pre-AD individuals from healthy controls and may highlight visuomotor dysfunction as an early indicator of Alzheimer's development.
Article Synopsis
  • The study explored cognitive deficits in patients with schizophrenia, focusing on their ability to filter out irrelevant visual information and its relation to positive and negative symptoms.
  • Two experiments assessed how well participants classified visual stimuli by ignoring specific dimensions; results showed that individuals with schizophrenia had slower reactions when faced with irrelevant variations compared to healthy controls.
  • The findings indicate that attention deficits in filtering visual information in schizophrenia may be linked to positive symptoms, although the study's small sample size poses a limitation.
Article Synopsis
  • Researchers studied how 4- and 6-month-old infants connect pitch with the size of objects.
  • The results showed that only the 6-month-olds demonstrated a clear connection between these two senses.
  • This suggests that greater experience or developmental maturity helps older infants form these crossmodal associations, which younger infants do not yet show.
Article Synopsis
  • The study explored how well people can distinguish between languages based on rhythmic patterns, specifically looking at English and Japanese.
  • It tested various sensory modes: auditory, visual, both together, and tactile (touch), finding that people could discriminate languages effectively using these methods, especially auditory cues.
  • The final findings suggest that it's possible to identify speech rhythms through visual and tactile means, but auditory discrimination remains the most effective.
Article Synopsis
  • The study explores how the brain aligns visual and auditory signals that are not perfectly synchronized, specifically focusing on temporal realignment when such signals have a 706 ms delay.
  • Participants were exposed to these stimuli in different spatial arrangements and took a simultaneity judgment task afterward, revealing that temporal realignment occurs when visual stimuli lead the auditory stimuli.
  • The findings indicate that even with differing spatial positions and asynchrony, the brain can recalibrate sensory inputs, influenced by common experiences where vision typically precedes sound in real-world settings.
Article Synopsis
  • Adults, like infants, can distinguish between languages using only visual cues from speech.
  • Those who learned English as a first or second language before age 6 were significantly better at telling French from English based on visual speech cues alone than those who learned it after age 6.
  • The study suggests that early language exposure plays a vital role in processing visual speech cues, which might help explain why adults struggle to learn new languages later in life.
Article Synopsis
  • The study examined how 6-month-old infants react to short-term exposure to asynchronous audiovisual stimuli.
  • Unlike adults, who typically adjust their perception over time, infants showed heightened sensitivity to synchronized sounds and visuals after experiencing asynchrony.
  • This suggests that infants process audiovisual information differently than adults, improving their ability to detect synchrony rather than recalibrating their perception.
Article Synopsis
  • In a study, participants were exposed to either synchronized audiovisual stimuli or a 220-ms delay for 3 minutes, to see whether this adaptation affected their perception of sounds at different frequencies.
  • Results showed that participants' perception of audiovisual timing improved regardless of sound frequency, indicating that this adaptation involves broader brain areas not limited by specific sound features such as frequency.
Article Synopsis
  • The study investigates how prior experience with audiovisual stimuli affects our ability to perceive simultaneity in language, specifically comparing English and Spanish speakers.
  • Results showed that when listening to sentences in their native language, participants needed to see visual speech precede the audio by a longer time than when listening in a non-native language.
  • As participants gain more experience with a non-native language, the differences in how they process visual and auditory signals tend to lessen, highlighting how visual information influences our perception of timing in audiovisual communication.
Article Synopsis
  • The brain adjusts its perception of time between visual and auditory signals when they are out of sync, leading to a reduction in the perceived lag between them.
  • While the auditory system is thought to process timing more accurately, this study demonstrates that visual and auditory adaptations can influence reaction times (RTs) to sound stimuli differently after exposure to asynchronous signals.
  • Participants displayed faster or slower RTs to sounds depending on whether they experienced auditory-lagging or auditory-leading asynchrony, indicating that prolonged exposure to audiovisual misalignment alters how we respond to auditory information.
Article Synopsis
  • Exposure to asynchronous audiovisual speech can change how we perceive timing in simpler auditory and visual stimuli, as evidenced by changes in temporal order judgment (TOJ) and simultaneity judgment (SJ) tasks.
  • In an experiment, participants judged the timing of a light flash and a noise while monitoring a speech stream that was either synchronized or delayed, affecting their timing sensitivity.
  • The study found that desynchronized speech significantly altered responses in the SJ task, but not in the TOJ task, suggesting these tasks measure different aspects of temporal perception.
Article Synopsis
  • One classic example of multisensory integration is how humans combine speech sounds with corresponding visual gestures, like lip movements.
  • Recent research indicates that this integration is not entirely automatic and can be affected by how much attention we have available, particularly when our focus is diverted.
  • In a study testing this, participants experienced a reduced ability to integrate visual and auditory speech information when also engaged in a challenging tactile task, suggesting that attention constraints influence how different senses work together.
Article Synopsis
  • The study aimed to investigate how well individuals can differentiate between languages based on visual cues from silent video clips of speech, focusing on Spanish and Catalan.
  • Bilingual Spanish-Catalan speakers successfully identified language differences using visual information alone, while speakers of Italian and English without familiarity with the languages struggled to do so.
  • The ability to discriminate was also observed in Spanish monolinguals, suggesting that even limited knowledge of one language aids recognition, although bilinguals performed better overall. The findings emphasize the richness of visual speech cues beyond traditional theories.
Article Synopsis
  • The study found that infants as young as 4 and 6 months can distinguish between English and French just by watching silent mouth movements.
  • By 8 months, only those infants exposed to both languages (bilingual) can still perform this discrimination.
  • These results suggest that infants have an early ability to visually recognize languages and that they gradually become more selective in their perceptual skills based on their language exposure.
Article Synopsis
  • The study examined how listening to an asynchronous speech stream influences people's ability to perceive the order of sounds and visuals in VCV (vowel-consonant-vowel) speech video clips.
  • Participants judged whether the sound or visual gesture occurred first while some were also exposed to an additional asynchronous word stream.
  • Results showed that focusing on the asynchronous speech caused a noticeable shift in participants' judgments, suggesting that adapting to delayed audio can alter the perception of more intricate audiovisual speech.
Article Synopsis
  • Previous research found that our brains can adjust for slight timing differences between sounds and visuals.
  • This study experimented to see if the brain could do the same for sounds and touch, where participants were exposed to sounds and tactile signals either in sync or with a slight delay.
  • Results indicated that participants found it harder to judge the order of sounds and touches after being exposed to the delayed stimuli, suggesting that our brains have a mechanism to adjust for timing differences in combined sensory input.
Article Synopsis
  • Researchers explored how visual speech cues (like lip movements) affect the ability to perceive sounds in a second language (L2).
  • They found that Spanish-dominant bilinguals struggled to distinguish between certain Catalan sounds when only auditory information was provided, while Catalan-dominant bilinguals did not have this issue.
  • However, when visual and auditory stimuli were combined, all participants improved their ability to discern the sounds, indicating that visual cues can significantly enhance L2 sound perception through multisensory integration.
Article Synopsis
  • The current study uses an implicit approach to examine L2 phoneme discrimination among early bilinguals who speak Catalan and Spanish.
  • Results show that Catalan-dominant bilinguals were slower in processing sounds when the second syllable had variable sounds, indicating that they categorize L2 sounds based on their L1, unlike Spanish-dominant participants.