Objectives: Aging populations commonly experience a decline in sensory functions, which negatively affects perceptual decision-making. The decline in sensory functions has been shown to be partially compensated by audiovisual integration. Although audiovisual integration may have a positive effect on perception, it remains unclear whether the perceptual improvements observed in older adults during perceptual decision-making are better explained by the early or late integration hypothesis.
Methods: An audiovisual categorization task was used to explore responses to unisensory and audiovisual stimuli in young and older adults. A behavioral drift diffusion model (DDM) and electroencephalography (EEG) were applied to characterize differences in cognitive and neural dynamics between the two groups.
Results: The DDM showed that older adults exhibited higher drift rates and shorter non-decision times for audiovisual stimuli than for visual or auditory stimuli alone. The EEG results showed that, during the early sensory-encoding stage (150 to 300 ms), older adults exhibited greater audiovisual integration in the beta band than younger adults. During the late decision-formation stage (500 to 700 ms), older adults exhibited greater audiovisual integration than younger adults both in the beta band and at anterior frontal electrodes.
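For readers unfamiliar with the beta-band measure reported above, the sketch below illustrates one common way such band-limited power is quantified: band-pass filtering a signal to the beta range (13 to 30 Hz) and taking the Hilbert envelope. This is a generic illustration on synthetic data, not the pipeline used in the study; the function name, filter order, and test frequencies are all assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def beta_band_power(signal, fs, low=13.0, high=30.0):
    """Band-pass a single-channel signal to the beta band (13-30 Hz)
    and return its mean instantaneous power via the Hilbert envelope.
    Filter order and band edges are illustrative choices."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, signal)          # zero-phase filtering
    envelope = np.abs(hilbert(filtered))       # instantaneous amplitude
    return float(np.mean(envelope ** 2))

# Synthetic check: a 20 Hz (beta) oscillation should carry more
# beta-band power than a 5 Hz (theta) oscillation of equal amplitude.
fs = 500
t = np.arange(0, 2.0, 1 / fs)
beta_sig = np.sin(2 * np.pi * 20 * t)
theta_sig = np.sin(2 * np.pi * 5 * t)
print(beta_band_power(beta_sig, fs) > beta_band_power(theta_sig, fs))
```

In practice, an analysis like the one in the abstract would apply such a measure per electrode and per time window (e.g., 150 to 300 ms post-stimulus) before comparing conditions.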
Discussion: These findings highlight the crucial role of audiovisual integration in both the early and late stages of perceptual decision-making in older adults. The results suggest that enhanced audiovisual integration in older adults compared with younger adults may serve as a specific mechanism to mitigate the negative effects of aging on perceptual decision-making.
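The DDM findings above (higher drift rate, shorter non-decision time for audiovisual stimuli) can be made concrete with a minimal simulation of a two-boundary diffusion process. This is a generic textbook sketch, not the authors' fitting procedure; all parameter values (drift rates, boundary, non-decision times) are hypothetical and chosen only to show how these parameters shape response times.

```python
import numpy as np

def simulate_ddm(drift, boundary=1.0, non_decision=0.3,
                 dt=0.001, noise=1.0, max_t=5.0, rng=None):
    """Simulate one trial of a two-boundary drift diffusion model.

    Evidence starts at 0 and accumulates with the given drift plus
    Gaussian noise until it crosses +boundary (correct response) or
    -boundary (error). Returns (response_time, correct)."""
    rng = rng or np.random.default_rng()
    x, t = 0.0, 0.0
    while abs(x) < boundary and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return non_decision + t, x >= boundary

rng = np.random.default_rng(0)
# Hypothetical parameters: audiovisual trials get a higher drift rate
# and a shorter non-decision time than unisensory trials, mirroring
# the qualitative pattern reported for older adults.
av = [simulate_ddm(2.0, non_decision=0.25, rng=rng) for _ in range(500)]
uni = [simulate_ddm(1.0, non_decision=0.35, rng=rng) for _ in range(500)]
av_mean = np.mean([rt for rt, _ in av])
uni_mean = np.mean([rt for rt, _ in uni])
print(f"mean RT audiovisual: {av_mean:.3f} s, unisensory: {uni_mean:.3f} s")
```

With a higher drift rate, evidence reaches the boundary sooner, so the simulated audiovisual condition yields faster mean response times, which is the behavioral signature the abstract attributes to audiovisual integration.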
DOI: http://dx.doi.org/10.1093/geronb/gbaf037
IEEE Trans Vis Comput Graph
March 2025
With the continuous advancement of artificial intelligence technology, data-driven methods for reconstructing and animating virtual agents have achieved increasing levels of realism. However, there is limited research on how these novel data-driven methods, combined with voice cues, affect user perceptions. We use advanced data-driven methods to reconstruct stylized agents and combine them with synthesized voices to study their effects on users' trust and other perceptions.
Chem Senses
March 2025
Department of Psychology, Stockholm University, 106 91 Stockholm, Sweden.
Working memory (WM) processes are assumed to operate on a wide variety of sensory materials, yet WM research rarely extends beyond sight and hearing. In this systematic review, we integrate research from studies that address WM in olfaction, the sense of smell, spanning the last 50 years (N=44). We assessed whether 21 proposed "benchmarks" for WM generalize to olfactory WM.
Nat Commun
March 2025
Center for Synaptic Brain Dysfunctions, IBS, Daejeon, 34141, Republic of Korea.
Decision-making in mammals fundamentally relies on integrating multiple sensory inputs, with conflicting information resolved flexibly based on a dominant sensory modality. However, the neural mechanisms underlying state-dependent changes in sensory dominance remain poorly understood. Our study demonstrates that locomotion in mice shifts auditory-dominant decisions toward visual dominance during audiovisual conflicts.
Atten Percept Psychophys
March 2025
Division of Pediatric Dentistry, Saint Barnabas Hospital, Bronx, NY, USA.
An unintelligible video recording of a face uttering a sentence and an unintelligible acoustic sinusoid following the frequency variation of a single vocal resonance of the utterance were intelligible when presented together at their veridical synchrony. The intelligibility resulted from audiovisual sensory integration and phonetic perceptual analysis, which depended neither on the separate resolution of linguistic impressions in each modality nor on closed-set reports about a single pair of minimal phonemic contrast features. Likewise, audiovisual integration could not be attributed to Gestalt-derived similarity principles applied unimodally or bimodally.
Front Neurosci
February 2025
Institute for Hearing Technology and Acoustics, RWTH Aachen University, Aachen, Germany.
Audiovisual cross-modal correspondences (CMCs) refer to the brain's inherent ability to subconsciously connect auditory and visual information. These correspondences reveal essential aspects of multisensory perception and influence behavioral performance, enhancing reaction times and accuracy. However, the impact of different types of CMCs-arising from statistical co-occurrences or shaped by semantic associations-on information processing and decision-making remains underexplored.