A major problem in natural vision is how neurons in the early visual system encode widely varying visual input with the limited dynamic range of their activity. Recent experiments suggest that retinal neurons adapt their responses not only to the temporal mean but also to the temporal variance of the visual input. Inspired by these results, we propose a simple model in which temporal adaptation is achieved by a transformation consisting of linear filtering followed by variance normalisation. We show that such a transformation efficiently adapts to the temporal statistics of natural time series of intensities by removing most of their redundancy, whereas no linear transformation alone achieves the same goal. The results reproduce important features of temporal adaptation in real vision.
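A minimal sketch of the two-stage transformation described above is given below, assuming a discrete intensity time series processed by a temporal convolution and then divided by a running estimate of the local standard deviation. The specific filter shape, normalisation window, and constants are illustrative choices, not values taken from the paper.

```python
import numpy as np

def temporal_adapt(intensities, filter_taps, norm_window=50, eps=1e-6):
    """Linear temporal filtering followed by divisive variance normalisation.

    The filter taps, window length, and epsilon are hypothetical parameters
    chosen for illustration only.
    """
    # Linear stage: convolve the intensity time series with a temporal filter.
    filtered = np.convolve(intensities, filter_taps, mode="same")

    # Normalisation stage: divide by a running estimate of the local standard
    # deviation, so the output variance stays roughly constant over time.
    kernel = np.ones(norm_window) / norm_window
    local_var = np.convolve(filtered ** 2, kernel, mode="same")
    return filtered / np.sqrt(local_var + eps)

# Example: a biphasic temporal filter applied to a synthetic luminance trace
# whose variance changes halfway through, mimicking a change in input contrast.
t = np.arange(30)
taps = np.exp(-t / 3.0) - 0.5 * np.exp(-t / 9.0)
signal = np.concatenate([np.random.randn(500) * 0.2, np.random.randn(500) * 2.0])
adapted = temporal_adapt(signal, taps)
```

With such a scheme, the output amplitude after the variance change returns toward its pre-change range, which is the kind of contrast adaptation the abstract attributes to retinal neurons.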
DOI: http://dx.doi.org/10.1016/s0042-6989(03)00312-2