When imitating biological sensors artificially, the early processing of sensory input is still not fully understood and therefore cannot be reproduced completely. Building hybrid systems that combine artificial and real biological components is a promising solution. For example, when a dragonfly is used as a living sensor, the early processing of visual information is performed entirely in the dragonfly's brain; the only significant remaining tasks are recording and processing the neural signals in software and/or hardware. Building on existing work focused on recording neural signals, this paper proposes a software application for neural information processing that serves as the visual processing module of a dragonfly hybrid bio-robot. After a neural signal is recorded in real time, action potentials are detected and matched against predefined templates to determine when, and which, descending neurons fire. The output of the proposed system can then be used to control the other parts of the robot platform.
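The detection-and-matching step described above can be illustrated with a minimal sketch: threshold-based spike detection followed by nearest-template classification. This is not the authors' implementation; the function names, the MAD-based noise estimate, the window length, and the Euclidean distance matcher are all illustrative assumptions about how such a pipeline is commonly built.

```python
import numpy as np

def detect_spikes(signal, fs, threshold_std=4.0, window_ms=2.0):
    """Detect candidate action potentials by amplitude thresholding.

    Returns waveform snippets cut around each threshold crossing.
    The threshold is a multiple of a robust noise estimate (assumed
    here to be the median-absolute-deviation estimator).
    """
    half_win = int(window_ms * 1e-3 * fs / 2)
    noise = np.median(np.abs(signal)) / 0.6745      # robust noise estimate (assumption)
    threshold = threshold_std * noise
    above = np.where(np.abs(signal) > threshold)[0]
    spikes, last = [], -np.inf
    for idx in above:
        if idx - last < half_win:                   # skip samples belonging to the same spike
            continue
        if half_win <= idx < len(signal) - half_win:
            spikes.append(signal[idx - half_win: idx + half_win])
            last = idx
    return spikes

def classify_spike(snippet, templates):
    """Match a spike snippet to the closest predefined template.

    `templates` maps a descending-neuron label to a mean waveform of
    the same length as the snippet; the label with the smallest
    Euclidean distance is returned (a simple stand-in for whatever
    matching criterion the actual system uses).
    """
    label, best = None, np.inf
    for name, tmpl in templates.items():
        dist = np.linalg.norm(snippet - tmpl)
        if dist < best:
            label, best = name, dist
    return label
```

In use, each detected snippet would be passed to `classify_spike` with templates recorded beforehand for each descending neuron of interest, yielding a stream of (time, neuron) events that downstream robot-control modules could consume.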
DOI: 10.1109/EMBC.2014.6943926