For many years, the dominant theoretical framework guiding research into the neural origins of perceptual experience has been provided by hierarchical feedforward models, in which sensory inputs are passed through a series of increasingly complex feature detectors. However, the long-standing orthodoxy of these accounts has recently been challenged by a radically different set of theories that contend that perception arises from a purely inferential process supported by two distinct classes of neurons: those that transmit predictions about sensory states and those that signal sensory information that deviates from those predictions. Although these predictive processing (PP) models have become increasingly influential in cognitive neuroscience, they are also criticized for lacking the empirical support to justify their status.
Both spatial and temporal context influence our perception of visual stimuli. For instance, both nearby moving stimuli and recently viewed motion can bias the perceived direction of a moving stimulus. Because these spatial and temporal context-dependent effects have similar spatial tuning properties, it is often assumed that they serve a common functional goal in motion processing and arise from common neural mechanisms.