Advanced driver assistance systems (ADAS) can enhance road safety by sending warning signals to drivers. Multimodal signals are gaining attention in ADAS warning design because they offer redundant information that facilitates human-system communication. However, no consensus has been reached on which multimodal design offers the optimal benefit to road safety. Icons map iconically onto the real world and are associated with fast recognition and response times. Therefore, this study aimed to investigate whether visual and auditory icons benefit the effectiveness of audiovisual multimodal warnings. Thirty-two participants (16 female) experienced four types of unimodal warnings (high- and low-mapping visual warnings and high- and low-mapping auditory warnings) and four types of audiovisual warnings (high-mapping visual + high-mapping auditory, low-mapping visual + low-mapping auditory, high-mapping visual + low-mapping auditory, and low-mapping visual + high-mapping auditory) in simulated driving conditions. Visual warnings were presented on a head-up display. Results showed that multimodal warnings outperformed unimodal warnings (i.e., a modality effect). We found a mapping effect in audiovisual warnings, but only high-mapping auditory constituents benefited warning effectiveness. Eye-movement results revealed that the high-mapping constituents might distract drivers from the road. This study adds evidence that multimodal warnings can offer extra benefits to drivers and that high-mapping auditory signals should be included in multimodal warning design to achieve better driving performance.
DOI: http://dx.doi.org/10.1016/j.apergo.2021.103638
Brain Struct Funct
January 2025
Department of Psychiatry, Psychotherapy and Psychosomatics, School of Medicine, RWTH Aachen University, Aachen, Germany.
Physiological responses derived from audiovisual perception during assisted driving are associated with the regulation of the autonomic nervous system (ANS), especially in emergencies. However, the interaction of event-related brain activity and the ANS regulating peripheral physiological indicators (i.e.
Sci Rep
January 2025
Department of Experimental Psychology, Ghent University, Ghent, Belgium.
How are arbitrary sequences of verbal information retained and manipulated in working memory (WM)? Increasing evidence suggests that serial order in verbal WM is spatially coded and that spatial attention is involved in access and retrieval. Based on the idea that the brain areas controlling spatial attention are also involved in oculomotor control, we used eye tracking to reveal how the spatial structure of serial-order information is accessed in verbal WM. In two experiments, participants memorized a sequence of auditory words in the correct order.
Commun Biol
January 2025
School of Psychology, Shenzhen University, Shenzhen, China.
Speech processing involves a complex interplay between sensory and motor systems in the brain, essential for early language development. Recent studies have extended this sensory-motor interaction to visual word processing, emphasizing the connection between reading and handwriting during literacy acquisition. Here we show how language-motor areas encode motoric and sensory features of language stimuli during auditory and visual perception, using functional magnetic resonance imaging (fMRI) combined with representational similarity analysis.
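The representational similarity analysis (RSA) mentioned above compares how two systems (e.g., a brain region and a stimulus model) represent the same set of stimuli, by correlating their representational dissimilarity matrices (RDMs). As a minimal illustration of the general method (not the specific pipeline used in this study), the sketch below builds RDMs from hypothetical activation matrices with random placeholder data and compares them with a rank correlation:

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(patterns):
    """Representational dissimilarity matrix (condensed form).

    patterns: (n_stimuli, n_features) activation matrix;
    returns the correlation distance between every pair of
    stimulus patterns (1 - Pearson r), one value per pair.
    """
    return pdist(patterns, metric="correlation")

# Hypothetical data: responses of two systems to the same 10 stimuli.
rng = np.random.default_rng(0)
brain = rng.normal(size=(10, 50))   # e.g., 50 voxels per stimulus
model = rng.normal(size=(10, 20))   # e.g., 20 model features per stimulus

# RSA: rank-correlate the two RDMs; higher rho means the two
# systems impose a more similar geometry on the stimulus set.
rho, _ = spearmanr(rdm(brain), rdm(model))
print(f"brain-model RDM correlation: {rho:.3f}")
```

Because RDMs abstract away the original feature spaces, this comparison works even when the two systems have different dimensionality, which is what lets fMRI activity be compared against motoric or sensory feature models.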
eLife
January 2025
Department of Psychology, Queen's University, Kingston, Canada.
Movie-watching is a central aspect of our lives and an important paradigm for understanding the brain mechanisms behind cognition as it occurs in daily life. Contemporary views of ongoing thought argue that the ability to make sense of events in the 'here and now' depends on the neural processing of incoming sensory information by auditory and visual cortex, which are kept in check by systems in association cortex. However, we currently lack an understanding of how patterns of ongoing thought map onto the different brain systems when we watch a film, partly because methods of sampling experience disrupt the dynamics of brain activity and the experience of movie-watching.
Brain
January 2025
Department of Neurology, Medical College of Wisconsin, Milwaukee, WI 53226, USA.
Acoustic-phonetic perception refers to the ability to perceive and discriminate between speech sounds. Acquired impairment of acoustic-phonetic perception is known historically as "pure word deafness" and typically follows bilateral lesions of the cortical auditory system. The extent to which this deficit occurs after unilateral left hemisphere damage and the critical left hemisphere areas involved are not well defined.