The aim of the present study was to investigate interactions between vision and audition during a visual target acquisition task performed in a virtual environment. In two experiments, participants were required to perform an acquisition task guided by auditory and/or visual cues. In both experiments the auditory cues were constructed using virtual 3-D sound techniques based on nonindividualized head-related transfer functions. In Experiment 1 the visual cue was constructed in the form of a continuously updated 2-D arrow. In Experiment 2 the visual cue was a nonstereoscopic, perspective-based 3-D arrow. The results suggested that virtual spatial auditory cues reduced acquisition time but were not as effective as the virtual visual cues. Experiencing the 3-D perspective-based arrow rather than the 2-D arrow produced a faster acquisition time not only in the visually aided conditions but also when the auditory cues were presented in isolation. Suggested novel applications include providing 3-D nonstereoscopic, perspective-based visual information on radar displays, which may lead to a better integration with spatial virtual auditory information.
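
To make the cueing geometry concrete, the sketch below shows one plausible way a continuously updated 2-D arrow cue could be derived from the participant's position, heading, and the target location. This is an illustrative assumption, not the authors' implementation: the function name, coordinate conventions, and sign choices are hypothetical.

```python
import math

def arrow_cue_angle(listener_pos, listener_heading_deg, target_pos):
    """Angle (degrees) a 2-D arrow would point, relative to the listener's
    current heading, to indicate the direction of the target.

    Positions are (x, y) pairs in the horizontal plane; the heading is
    measured counterclockwise from the +x axis. Positive results mean
    "turn left", negative mean "turn right". (Illustrative conventions.)
    """
    dx = target_pos[0] - listener_pos[0]
    dy = target_pos[1] - listener_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))            # world-frame bearing to the target
    relative = (bearing - listener_heading_deg) % 360.0   # bearing relative to current heading
    if relative > 180.0:                                   # fold into (-180, 180]
        relative -= 360.0
    return relative


if __name__ == "__main__":
    # Listener at the origin facing +x, target ahead and to the left:
    # the arrow points 135 degrees to the left.
    print(arrow_cue_angle((0.0, 0.0), 0.0, (-1.0, 1.0)))
```

In a head-tracked virtual environment this angle would be recomputed every frame, which is what "continuously updated" implies; a perspective-based 3-D arrow would add an elevation component computed the same way.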

Source: http://dx.doi.org/10.1518/hfes.46.4.728.56815
