Auditory roughness elicits aversion and higher activation in cerebral areas involved in threat processing, but its link with defensive behavior is unknown. Defensive behaviors are triggered by intrusions into the space immediately surrounding the body, called peripersonal space (PPS). Integrating multisensory information in PPS is crucial to ensure the protection of the body.
Background: Humans perceive near space and far space differently. Peripersonal space (PPS), i.e.
Multisensory integration of stimuli occurring in the area surrounding our bodies gives rise to the functional representation of peripersonal space (PPS). PPS extent is flexible according to the affective context and the target of an action, but little is known about how social context modulates it. We used an audiotactile interaction task to investigate the PPS of individuals during social interaction.
Human listeners can accurately recognize an impressive range of complex sounds, such as musical instruments or voices. The underlying mechanisms are still poorly understood. Here, we aimed to characterize the processing time needed to recognize a natural sound.
The space immediately surrounding our bodies, i.e., peripersonal space (PPS), is a critical area for interaction with the external world, be it to deal with imminent threat or to attain objects of interest.
Affect, space, and multisensory integration are processes that are closely linked. However, it is unclear whether the spatial location of emotional stimuli interacts with multisensory presentation to influence the emotional experience they induce in the perceiver. In this study, we used the unique advantages of virtual reality techniques to present potentially aversive crowd stimuli embedded in a natural context and to control their display in terms of sensory and spatial presentation.
Sounds in our environment like voices, animal calls or musical instruments are easily recognized by human listeners. Understanding the key features underlying this robust sound recognition is an important question in auditory science. Here, we studied the recognition by human listeners of new classes of sounds: acoustic and auditory sketches, sounds that are severely impoverished but still recognizable.
Spatial memory is mainly studied through the visual sensory modality: navigation tasks in humans rarely integrate dynamic and spatial auditory information. In order to study how a spatial scene can be memorized on the basis of auditory and idiothetic cues only, we constructed an auditory equivalent of the Morris water maze, a task widely used to assess spatial learning and memory in rodents. Participants were equipped with wireless headphones, which delivered a soundscape updated in real time according to their movements in 3D space.
Peri-personal space (PPS) is defined as the space immediately surrounding our bodies, which is critical in the adaptation of our social behavior. As a space of interaction with the external world, PPS is involved in the control of motor action as well as in the protection of the body. The boundaries of this PPS are known to be flexible, but so far little is known about how they are influenced by unreasonable fear.
In a natural environment, affective information is perceived via multiple senses, mostly audition and vision. However, the impact of multisensory information on affect remains relatively unexplored. In this study, we investigated whether the auditory-visual presentation of aversive stimuli influences the experience of fear.
Studies of the neural mechanisms involved in goal-directed movements tend to concentrate on the role of vision. We present here an attempt to address the mechanisms whereby an auditory input is transformed into a motor command. The spatial and temporal organization of hand movements was studied in normal human subjects as they pointed toward unseen auditory targets located in a horizontal plane in front of them.
Cyberpsychol Behav Soc Netw
February 2013
Traditionally, virtual reality (VR) exposure-based treatment has concentrated primarily on the presentation of a high-fidelity visual experience. However, adequately combining the visual and the auditory experience provides a powerful tool to enhance sensory processing and modulate attention. We present the design and usability testing of an auditory-visual interactive environment for investigating VR exposure-based treatment for cynophobia.
Purpose: This study was designed to investigate methods to help patients suffering from unilateral tinnitus synthesize an auditory replica of their tinnitus.
Materials and Methods: Two semi-automatic methods (A and B) derived from the auditory threshold of the patient, and a method (C) combining a pure tone and a narrow band-pass noise centred on an adjustable frequency, were devised and rated on their likeness over two test sessions. A third test evaluated the stability over time of the synthesized tinnitus replica built with method C, and its proneness to merge with the patient's tinnitus.
Cynophobia (dog phobia) has relevant visual and auditory components. In order to investigate the efficacy of virtual reality (VR) exposure-based treatment for cynophobia, we studied the efficiency of auditory-visual environments in generating presence and emotion. We conducted an evaluation test with healthy participants sensitive to cynophobia in order to assess the capacity of auditory-visual virtual environments (VE) to generate fear reactions.
Stud Health Technol Inform
January 2013
Human-computer confluence (HCC) is an ambitious research program studying how the emerging symbiotic relation between humans and computing devices can enable radically new forms of sensing, perception, interaction, and understanding. It is an interdisciplinary field, bringing together researchers from areas as varied as pervasive computing, bio-signal processing, neuroscience, electronics, robotics, and virtual and augmented reality, and it offers considerable potential for applications in medicine and rehabilitation.
Prog Neuropsychopharmacol Biol Psychiatry
August 2011
Complaints related to dizziness, balance problems, and spatial disorientation in psychiatry have seldom been considered a possible manifestation of a distorted multisensory integrative ability. Several kinds of mismatches among simultaneous sensory information are encountered in everyday life, but despite these, the central nervous system usually manages to update the internal representation of the body in the surrounding space. In some cases, a sensory mismatch may elicit an erroneous perception of the body in space, resulting in anxiety, dizziness and balance problems.
View Article and Find Full Text PDFIt has been speculated that superstitiousness and obsessive-compulsive disorder (OCD) exist along a continuum. The distinction between superstitious behavior and superstitious belief, however, is crucial for any theoretical account of claimed associations between superstitiousness and OCD. By demonstrating that there is a dichotomy between behavior and belief, which is experimentally testable, we can differentiate superstitious behavior from superstitious belief, or magical ideation.
Background: Recognizing an object requires binding together several cues, which may be distributed across different sensory modalities, and ignoring competing information originating from other objects. In addition, knowledge of the semantic category of an object is fundamental to determining how we should react to it. Here we investigate the role of semantic categories in the processing of auditory-visual objects.
Recognizing a natural object requires one to pool information from various sensory modalities, and to ignore information from competing objects. That the same semantic knowledge can be accessed through different modalities makes it possible to explore the retrieval of supramodal object concepts. Here, object-recognition processes were investigated by manipulating the relationships between sensory modalities, specifically the semantic content and spatial alignment of auditory and visual information.
The subjective experience conferred by auditory perception has rarely been addressed outside of studies of auditory hallucinations. The aim of this study is to describe the phenomenology of auditory experiences in individuals who endorse magical beliefs but do not report hallucinations. We examined the relationship between subjective auditory sensitivity and a 'psychotic-like' thinking style.
Visual hemineglect, the failure to explore the half of space (real or imagined) contralateral to a cerebral lesion, with respect to the body or head, can be seen as an illustration of the brain's Euclidean representation of the left/right axis. Here we present two patients with left-sided neglect, in whom only the left hemispace in front of an imagined and/or real body position was inaccessible, while the space behind them remained fully represented. These observations suggest that of the three Euclidean dimensions (up/down, left/right, and front/back), at least the latter two are modularly and separately represented in the human brain.
The primary aim of this study was to evaluate the effect of auditory feedback in a VR system planned for clinical use and to address the different factors that should be taken into account when building a bimodal virtual environment (VE). We conducted an experiment in which we assessed spatial performance in agoraphobic patients and normal subjects, comparing two kinds of VE, visual alone (Vis) and auditory-visual (AVis), during separate sessions. Subjects were equipped with a head-mounted display coupled with an electromagnetic sensor system and immersed in a virtual town.
After exposure to a consistent spatial disparity of auditory and visual stimuli, subjective localization of sound sources is usually shifted in the direction of the visual stimuli. This study investigates whether such aftereffects can be observed in humans after exposure to a conflicting bimodal stimulation in virtual reality and whether these aftereffects are confined to the trained locations. Fourteen subjects participated in an adaptation experiment, in which auditory stimuli were convolved with non-individual head-related transfer functions, delivered via headphones.
Hemispheric specialization is reliably demonstrated in patients with unilateral lesions or disconnected hemispheres, but is inconsistent in healthy populations. The reason for this paradox is unclear. We propose that functional hemispheric specialization in healthy participants depends upon functional brain states at stimulus arrival (FBS).
Brain Res Cogn Brain Res
June 2002
Studies of visual-vestibular and vestibular-proprioceptive interactions suggest that prolonged exposure to sensory conflicts induces a modification of the relation between sensory modalities for self-motion perception. In most models, conflicts are resolved by a weighting process. However, the brain could also switch between conflicting cues.