Encoding natural scenes with neural circuits with random thresholds.

Vision Research

Department of Electrical Engineering, Columbia University, New York, NY 10027, USA.

Published: October 2010

We present a general framework for the reconstruction of natural video scenes encoded with a population of spiking neural circuits with random thresholds. The natural scenes are modeled as space-time functions that belong to a space of trigonometric polynomials. The visual encoding system consists of a bank of filters, modeling the visual receptive fields, in cascade with a population of neural circuits, modeling encoding in the early visual system. The neuron models considered include integrate-and-fire neurons and ON-OFF neuron pairs with threshold-and-fire spiking mechanisms. All thresholds are assumed to be random. We demonstrate that neural spiking is akin to taking noisy measurements of the stimulus, for both time-varying and space-time-varying stimuli. We formulate the reconstruction problem as the minimization of a suitable cost functional in a finite-dimensional vector space and provide an explicit algorithm for stimulus recovery. We also present a general solution using the theory of smoothing splines in Reproducing Kernel Hilbert Spaces. We provide examples for both synthetic video and natural scenes, and demonstrate that the quality of the reconstruction degrades gracefully as the threshold variability of the neurons increases.
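To make the "spiking as noisy measurements" view concrete, the sketch below works through a minimal one-dimensional instance of the framework: a periodic stimulus drawn from a space of trigonometric polynomials is encoded by a single ideal integrate-and-fire neuron whose threshold is redrawn at every spike, and the decoder, which only knows the mean threshold, recovers the stimulus coefficients by regularized least squares. This is an illustrative sketch, not the paper's implementation: the parameter values (kappa, bias, delta0, sigma, lam) are assumptions, and the ridge penalty stands in for the cost functional and smoothing-spline machinery developed in the paper.

```python
import numpy as np

# --- Illustrative parameters (assumptions, not taken from the paper) ---
rng    = np.random.default_rng(0)
T      = 1.0                 # stimulus period [s]
M      = 5                   # order of the trigonometric polynomial
omega  = 2 * np.pi / T       # fundamental frequency
kappa  = 1.0                 # integration constant of the IAF neuron
bias   = 3.0                 # bias current, keeps u(t) + bias positive
delta0 = 0.02                # mean firing threshold
sigma  = 0.1 * delta0        # threshold standard deviation ("random threshold")

# Random stimulus in the space of trigonometric polynomials:
# u(t) = a0 + sum_m [ a_m cos(m*omega*t) + b_m sin(m*omega*t) ]
a = rng.normal(0.0, 0.3, M + 1)      # a[0] is the DC term
b = rng.normal(0.0, 0.3, M + 1)      # b[0] is unused
u = lambda t: a[0] + sum(a[m] * np.cos(m * omega * t) +
                         b[m] * np.sin(m * omega * t) for m in range(1, M + 1))

# --- Encoding: ideal integrate-and-fire neuron with a random threshold ---
dt, t = 1e-5, 0.0
y, thresh, spikes = 0.0, delta0 + sigma * rng.standard_normal(), [0.0]
while t < T:
    y += dt * (u(t) + bias) / kappa               # integrate the biased stimulus
    t += dt
    if y >= thresh:                               # threshold crossing -> spike
        spikes.append(t)
        y = 0.0
        thresh = delta0 + sigma * rng.standard_normal()   # redraw the threshold
tk = np.array(spikes)

# --- Decoding: each inter-spike interval is a noisy measurement ---
# t-transform: int_{t_k}^{t_{k+1}} u(s) ds = kappa*delta_k - bias*(t_{k+1} - t_k).
# The decoder only knows the mean threshold delta0, so these measurements are noisy.
q = kappa * delta0 - bias * np.diff(tk)

# Measurement matrix: analytic integrals of each basis function over each interval
def basis_integrals(t0, t1):
    row = [t1 - t0]                                                          # DC term
    for m in range(1, M + 1):
        row.append((np.sin(m*omega*t1) - np.sin(m*omega*t0)) / (m*omega))    # cos term
        row.append((np.cos(m*omega*t0) - np.cos(m*omega*t1)) / (m*omega))    # sin term
    return row

Phi = np.array([basis_integrals(t0, t1) for t0, t1 in zip(tk[:-1], tk[1:])])

# Regularized least squares (a simple stand-in for the paper's cost functional)
lam = 1e-6
c = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ q)

# Evaluate the reconstruction and report the error
tt = np.linspace(0, T, 1000)
E = np.column_stack([np.ones_like(tt)] +
                    [f(m * omega * tt) for m in range(1, M + 1) for f in (np.cos, np.sin)])
u_hat = E @ c
snr = 10 * np.log10(np.mean(u(tt)**2) / np.mean((u(tt) - u_hat)**2))
print(f"{len(tk) - 1} measurements, reconstruction SNR ~ {snr:.1f} dB")
```

Increasing sigma (the spread of the random threshold) lowers the reconstruction SNR gradually rather than abruptly, which is the one-dimensional analogue of the graceful degradation reported for the video examples.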

DOI: http://dx.doi.org/10.1016/j.visres.2010.03.015


