FPGA-based multimodal embedded sensor system integrating low- and mid-level vision.

Sensors (Basel)

Department of Computer Architectures and Automatic Control, Complutense University of Madrid, 28040 Madrid, Spain.

Published: June 2012

Motion estimation is a low-level vision task that is especially relevant due to its wide range of real-world applications. Many of the best motion estimation algorithms include features found in mammals, which demand huge computational resources and are therefore not usually available in real time. In this paper we present a novel bioinspired sensor based on the synergy between optical flow and orthogonal variant moments. The bioinspired sensor has been designed for Very Large Scale Integration (VLSI) using properties of the mammalian cortical motion pathway. This sensor combines low-level primitives (optical flow and image moments) in order to produce a mid-level vision abstraction layer. The results are described through experiments showing the validity of the proposed system and an analysis of the computational resources and performance of the applied algorithms.
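The two low-level primitives the abstract names can be sketched in software. The following is a minimal illustrative sketch, not the paper's FPGA/VLSI design: it uses a global least-squares (Lucas-Kanade-style) translation estimate as a stand-in for the optical flow primitive, and raw geometric moments as a stand-in for the orthogonal variant moments; all function names and the toy frames are assumptions for illustration.

```python
import numpy as np

def global_optical_flow(prev, curr):
    """Least-squares global translation between two grayscale frames.

    Solves A v = b with A = [[sum Ix^2, sum IxIy], [sum IxIy, sum Iy^2]]
    and b = -[sum IxIt, sum IyIt] (the brightness-constancy constraint
    Ix*vx + Iy*vy + It = 0, aggregated over the whole frame).
    """
    Ix = np.gradient(curr, axis=1)        # horizontal spatial gradient
    Iy = np.gradient(curr, axis=0)        # vertical spatial gradient
    It = curr - prev                      # temporal gradient
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(A, b)          # (vx, vy) in pixels/frame

def raw_moments(img):
    """Raw geometric moments m00, and the centroid (m10/m00, m01/m00)."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    m00 = img.sum()
    return m00, (xs * img).sum() / m00, (ys * img).sum() / m00

# Toy frames: a bright 10x10 square shifted one pixel to the right.
prev = np.zeros((32, 32)); prev[10:20, 10:20] = 1.0
curr = np.zeros((32, 32)); curr[10:20, 11:21] = 1.0

vx, vy = global_optical_flow(prev, curr)  # expect roughly (1, 0)
m00, cx, cy = raw_moments(curr)           # region "mass" and centroid
```

In a mid-level abstraction layer of the kind the abstract describes, such flow and moment outputs would be fused per region rather than globally; this sketch only shows the two primitives computed side by side.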

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3231703
DOI: http://dx.doi.org/10.3390/s110808164
