Publications by authors named "Roger Koenig-Robert"

Despite the dramatic rise of surveillance in our societies, only limited research has examined its effects on humans. While most research has focused on voluntary behaviour, no study has examined the effects of surveillance on more fundamental and automatic aspects of human perceptual awareness and cognition. Here, we show that being watched on CCTV markedly impacts a hardwired and involuntary function of human sensory perception: the ability to consciously detect faces.

The rapid transformation of sensory inputs into meaningful neural representations is critical to adaptive human behaviour. While non-invasive neuroimaging is the de facto method for investigating neural representations, it remains expensive, not widely available, time-consuming, and restrictive. Here we show that movement trajectories can be used to measure emerging neural representations with fine temporal resolution.
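
To make the approach concrete, below is a minimal, hypothetical sketch of time-resolved decoding from movement trajectories: at each trajectory time point a classifier predicts the stimulus condition across trials, and the time course of cross-validated accuracy indicates when condition information emerges. All data, shapes, and variable names are simulated illustrations, not the authors' actual pipeline.

```python
# Hypothetical sketch: time-resolved decoding of stimulus condition from
# movement (cursor) trajectories. Data and shapes are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_timepoints = 200, 100          # trials x trajectory samples
labels = rng.integers(0, 2, n_trials)      # two stimulus conditions

# trajectories: (n_trials, n_timepoints, 2) x/y cursor positions.
# Simulate a small condition-dependent drift that grows over time.
trajectories = rng.normal(0, 1, (n_trials, n_timepoints, 2))
drift = np.linspace(0, 1, n_timepoints)[None, :, None]
trajectories += drift * (labels[:, None, None] * 2 - 1) * np.array([1.0, 0.0])

# Decode condition from the x/y position at each time point separately.
accuracy = np.empty(n_timepoints)
for t in range(n_timepoints):
    X = trajectories[:, t, :]              # (n_trials, 2) features at time t
    accuracy[t] = cross_val_score(LogisticRegression(), X, labels, cv=5).mean()

print("peak decoding accuracy:", accuracy.max().round(2))
```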

Recent research suggests imagery is functionally equivalent to a weak form of visual perception. Here we report evidence across five independent experiments on adults that perception and imagery are supported by fundamentally different mechanisms: Whereas perceptual representations are largely formed via increases in excitatory activity, imagery representations are largely supported by modulating nonimagined content. We developed two behavioral techniques that allowed us to first put the visual system into a state of adaptation and then probe the additivity of perception and imagery.

Subliminal information can influence our conscious life: subliminal stimuli can influence cognitive tasks, and endogenous subliminal neural information can sway decisions before volition. Are decisions inextricably biased by subliminal information, or can training allow them to diverge from subliminal biases? We report that implicit bias training can remove the biases induced by subliminal sensory primes.

Despite the past few decades of research providing convincing evidence of the similarities in function and neural mechanisms between imagery and perception, for most of us the experiences of the two are undeniably different. Why? Here, we review and discuss the differences between imagery and perception and the possible underlying causes of these differences, from function to neural mechanisms. Specifically, we discuss the directional flow of information (top-down versus bottom-up), the different cortical layers targeted in primary visual cortex, and the possibly distinct neural mechanisms of modulation versus excitation. For the first time in history, neuroscience is beginning to shed light on this long-held mystery of why imagery and perception look and feel so different.

Controlling our thoughts is central to mental well-being, and its failure is at the crux of a number of mental disorders. Paradoxically, behavioral evidence shows that thought suppression often fails. Despite the broad importance of understanding the mechanisms of thought control, little is known about the fate of neural representations of suppressed thoughts.

Perception likely results from the interplay between sensory information and top-down signals. In this electroencephalography (EEG) study, we utilised the hierarchical frequency tagging (HFT) method to examine how such integration is modulated by expectation and attention. Using intermodulation (IM) components as a measure of nonlinear signal integration, we show in three different experiments that both expectation and attention enhance integration between top-down and bottom-up signals.
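
As an illustration of the intermodulation logic (simulated data only; not the study's analysis code), the sketch below tags two inputs at frequencies f1 and f2, passes them through a nonlinearity, and reads out spectral power at the fundamentals and at IM frequencies such as f1 + f2.

```python
# Illustrative sketch of intermodulation (IM) analysis: with two inputs
# tagged at f1 and f2, spectral power at combination frequencies such as
# f1 + f2 indexes nonlinear integration of the two signals. Simulated data.
import numpy as np

fs, dur = 500.0, 20.0                       # sampling rate (Hz), duration (s)
t = np.arange(0, dur, 1 / fs)
f1, f2 = 1.3, 7.0                           # hypothetical tagging frequencies

s1 = np.sin(2 * np.pi * f1 * t)             # slow (e.g., semantic) tag
s2 = np.sin(2 * np.pi * f2 * t)             # fast (e.g., contrast) tag
eeg = s1 + s2 + 0.5 * s1 * s2 + np.random.randn(t.size)  # nonlinearity -> IM

spectrum = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def power_at(f):
    # amplitude at the spectral bin closest to frequency f
    return spectrum[np.argmin(np.abs(freqs - f))]

for f, name in [(f1, "f1"), (f2, "f2"), (f1 + f2, "f1+f2"), (f2 - f1, "f2-f1")]:
    print(f"{name} ({f:.1f} Hz): {power_at(f):.3f}")
```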

The ability to control one's thoughts is crucial for attention, focus, ideation, and mental well-being. Although there is a long history of research into thought control, the inherent subjectivity of thoughts has made objective examination, and thus mechanistic understanding, difficult. Here, we report a novel method to objectively investigate thought-control success and failure by measuring the sensory strength of visual thoughts using binocular rivalry, a perceptual illusion.
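
A hypothetical scoring sketch of how such a rivalry-based measure could be computed is shown below: imagery (or suppression) strength is summarized as the proportion of rivalry trials in which the cued image becomes dominant. The column names and trial data are purely illustrative, not the study's actual format.

```python
# Hypothetical scoring sketch: quantify the sensory strength of a thought as
# the proportion of binocular rivalry trials in which the cued (imagined or
# suppressed) image becomes dominant. Data are illustrative only.
import pandas as pd

trials = pd.DataFrame({
    "cued_image":     ["red_grating", "green_grating", "red_grating", "green_grating"],
    "dominant_image": ["red_grating", "green_grating", "green_grating", "green_grating"],
    "condition":      ["imagine", "imagine", "suppress", "suppress"],
})

trials["primed"] = trials["cued_image"] == trials["dominant_image"]
priming = trials.groupby("condition")["primed"].mean()
print(priming)   # lower priming under "suppress" would indicate control success
```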

Is it possible to predict the freely chosen content of voluntary imagery from prior neural signals? Here we show that the content and strength of future voluntary imagery can be decoded from activity patterns in visual and frontal areas well before participants engage in voluntary imagery. Participants freely chose which of two images to imagine. Using functional magnetic resonance imaging (fMRI) and multi-voxel pattern analysis, we decoded imagery content as early as 11 seconds before the voluntary decision, in visual, frontal, and subcortical areas.
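
A minimal sketch of the multi-voxel pattern analysis logic, using simulated data and assumed shapes rather than the study's actual pipeline: a linear classifier is trained on voxel patterns at each pre-decision time point, and above-chance cross-validated accuracy indicates that choice-predictive information is present.

```python
# Illustrative MVPA sketch: decode the upcoming imagery choice from voxel
# patterns at successive time points before the decision. Simulated data;
# shapes and names are assumptions, not the study's actual analysis.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_voxels, n_tr = 120, 300, 8      # trials, voxels, pre-decision TRs
choices = rng.integers(0, 2, n_trials)      # which of two images was chosen

# bold: (n_trials, n_tr, n_voxels), with a weak choice-related pattern injected.
bold = rng.normal(0, 1, (n_trials, n_tr, n_voxels))
signal = rng.normal(0, 1, n_voxels) * 0.15
bold += (choices[:, None, None] * 2 - 1) * signal[None, None, :]

for tr in range(n_tr):
    acc = cross_val_score(LinearSVC(), bold[:, tr, :], choices, cv=5).mean()
    print(f"{(tr - n_tr) * 2:+d} s before decision: accuracy = {acc:.2f}")
```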

There is a growing understanding that both top-down and bottom-up signals underlie perception, but it is not known how these signals are integrated, nor how this integration depends on the predictability of the perceived stimuli. 'Predictive coding' theories describe this integration in terms of how well top-down predictions fit with bottom-up sensory input.
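
As a toy illustration of that intuition (not the model used in the study), the sketch below updates an internal estimate by a prediction error computed from noisy bottom-up input, with a fixed learning rate standing in for precision weighting.

```python
# Toy sketch of the predictive-coding intuition: a top-down prediction is
# compared with bottom-up input, and the weighted prediction error updates
# the internal estimate. Purely illustrative; not the study's model.
import numpy as np

rng = np.random.default_rng(2)
true_value = 3.0
prediction = 0.0                  # top-down prior estimate
learning_rate = 0.1               # stands in for precision weighting

for _ in range(50):
    sensory_input = true_value + rng.normal(0, 0.5)   # noisy bottom-up signal
    prediction_error = sensory_input - prediction
    prediction += learning_rate * prediction_error    # update toward the input

print(f"final estimate: {prediction:.2f} (true value {true_value})")
```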

Primate visual systems process natural images in a hierarchical manner: at early stages, neurons are tuned to local image features, while neurons in high-level areas are tuned to abstract object categories. Standard models of visual processing assume that the transition of tuning from image features to object categories emerges gradually along the visual hierarchy. Direct tests of such models remain difficult due to confounding alterations in low-level image properties when contrasting distinct object categories.

Isolating the neural correlates of object recognition and studying their fine temporal dynamics have been a great challenge in neuroscience. A major obstacle has been the difficulty of dissociating low-level feature extraction from object recognition activity itself. Here we present a new technique called semantic wavelet-induced frequency-tagging (SWIFT), in which cyclic wavelet-scrambling allowed us to isolate the neural correlates of object recognition from low-level feature extraction in humans using EEG.
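
The sketch below gives a much-simplified illustration of the cyclic wavelet-scrambling idea, using the PyWavelets package (an assumption on my part; this is not the published SWIFT implementation): semantic content is made to cycle at a tagging frequency by periodically mixing intact and scrambled wavelet detail coefficients, while the coarse approximation band is left untouched.

```python
# Much-simplified illustration of the SWIFT idea (not the published algorithm):
# image content cycles at a tagging frequency by periodically scrambling
# wavelet detail coefficients. Requires numpy and PyWavelets (pywt).
import numpy as np
import pywt

rng = np.random.default_rng(3)
image = rng.normal(size=(128, 128))          # stand-in for a natural image
tag_freq, frame_rate, duration = 1.5, 30, 2  # Hz, frames/s, seconds

coeffs = pywt.wavedec2(image, "db4", level=3)
frames = []
for i in range(frame_rate * duration):
    # weight = 1 -> intact image, weight = 0 -> fully scrambled details
    weight = 0.5 * (1 + np.cos(2 * np.pi * tag_freq * i / frame_rate))
    new_coeffs = [coeffs[0]]                 # keep the approximation band
    for band in coeffs[1:]:
        scrambled = tuple(rng.permutation(c.ravel()).reshape(c.shape) for c in band)
        new_coeffs.append(tuple(weight * c + (1 - weight) * s
                                for c, s in zip(band, scrambled)))
    frames.append(pywt.waverec2(new_coeffs, "db4"))

print(len(frames), "frames; image content peaks at", tag_freq, "Hz")
```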

The spatial distribution and the temporal dynamics of attention are well understood in isolation, but their interaction remains an open question. How does the shape of the attentional focus evolve over time? To answer this question, we measured spatiotemporal maps of endogenous and exogenous attention in humans (more than 140,000 trials in 23 subjects). We tested the visibility of a low-contrast target presented for 50 ms at different spatial distances and temporal delays from a cue, embedded in a noisy background.
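
A hypothetical analysis sketch for building such a spatiotemporal map: detection rates are binned by cue-target distance and cue-target delay. The column names and simulated responses below are illustrative only.

```python
# Illustrative sketch: build a spatiotemporal visibility map by binning hit
# rates as a function of cue-target distance (deg) and cue-target delay (ms).
# Column names and data are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
n = 5000
trials = pd.DataFrame({
    "distance_deg": rng.choice([0, 2, 4, 6, 8], n),       # cue-target distance
    "delay_ms":     rng.choice([50, 100, 200, 400], n),   # cue-target delay
    "detected":     rng.integers(0, 2, n),                # hit (1) or miss (0)
})

# Rows: distance; columns: delay; cells: mean detection rate.
visibility_map = trials.pivot_table(index="distance_deg",
                                    columns="delay_ms",
                                    values="detected",
                                    aggfunc="mean")
print(visibility_map.round(2))
```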

Background: Comparative studies of cognitive processes find similarities not only between humans and apes but also with monkeys. Even high-level processes, such as the ability to categorize classes of objects from any natural scene under ultra-rapid time constraints, seem to be present in rhesus macaque monkeys (despite their smaller brain and lack of language and a cultural background). An interesting and still open question concerns the degree to which the same images are processed with the same efficacy by humans and monkeys when a low-level cue, the spatial frequency content, is controlled.
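
One standard way to control spatial frequency content, sketched below with simulated images, is to give every image the same average Fourier amplitude spectrum while preserving its own phase. This is shown only to make the low-level cue concrete and is not necessarily the procedure used in the study.

```python
# Illustrative sketch of spatial frequency equalization: assign each image the
# mean Fourier amplitude spectrum of the set while keeping its own phase.
# Simulated images; not necessarily the study's procedure.
import numpy as np

rng = np.random.default_rng(5)
images = rng.normal(size=(10, 128, 128))            # stand-ins for grayscale photos

spectra = np.fft.fft2(images)                       # per-image 2-D FFT
mean_amplitude = np.abs(spectra).mean(axis=0)       # shared amplitude spectrum

equalized = np.real(np.fft.ifft2(mean_amplitude * np.exp(1j * np.angle(spectra))))
print(equalized.shape)  # same images, now matched in spatial frequency content
```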

The cytoskeleton and cytoskeletal motors play a fundamental role in neurotransmitter receptor trafficking, but proteins that link GABA(B) receptors (GABA(B)Rs) to the cytoskeleton have not been described. We recently identified Marlin-1, a protein that interacts with GABA(B)R1. Here, we explore the association of GABA(B)Rs and Marlin-1 with the cytoskeleton using a combination of biochemistry, microscopy, and live-cell imaging.
