A robust event-driven approach to always-on object recognition.

Neural Netw

Aix-Marseille Université, Institut de Neurosciences de la Timone, CNRS, Marseille, France.

Published: October 2024

AI Article Synopsis

  • The proposed neuromimetic architecture enables constant pattern recognition using an enhanced event-based algorithm called Hierarchy Of Time-Surfaces (HOTS), which builds its features from neuromorphic camera data.
  • Improvements include homeostatic gain control to boost learning of patterns and a new mathematical model that relates HOTS to Spiking Neural Networks (SNN), transforming it into an online event-driven classifier.
  • Validation on datasets like Poker-DVS, N-MNIST, and DVS Gesture shows that this architecture excels at rapid object recognition through real-time event processing.
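To make the time-surface idea mentioned in the synopsis concrete, here is a minimal sketch of how a local time surface can be computed from a stream of camera events. The neighborhood radius `R`, decay constant `TAU`, sensor size, and the synthetic events are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical parameters: a (2R+1) x (2R+1) neighborhood and decay constant tau.
R, TAU = 2, 1e4  # tau in microseconds (illustrative value)

def update_and_time_surface(last_times, x, y, t, p):
    """Record event (x, y, t, p) and return its local time surface.

    `last_times[p]` stores, per polarity p, the timestamp of the most
    recent event at each pixel (-inf where none has occurred yet).
    """
    last_times[p, y, x] = t
    patch = last_times[p,
                       max(y - R, 0):y + R + 1,
                       max(x - R, 0):x + R + 1]
    # Exponential decay of the elapsed time since each neighbor's last event;
    # pixels with no event so far decay to exactly 0.
    return np.exp(-(t - patch) / TAU)

# Usage: a tiny 2-polarity, 8x8 sensor and a few synthetic events.
last_times = np.full((2, 8, 8), -np.inf)
for (x, y, t, p) in [(3, 3, 0.0, 0), (4, 3, 5e3, 0), (3, 4, 9e3, 0)]:
    ts = update_and_time_surface(last_times, x, y, t, p)
```

The surface peaks at 1 at the current event's pixel and fades toward 0 for older or absent neighbors, which is what lets it encode the local dynamics of the scene.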

Article Abstract

We propose a neuromimetic architecture capable of always-on pattern recognition, i.e. recognition at any time during processing. To achieve this, we extend an existing event-based algorithm (Lagorce et al., 2017), which introduced novel spatio-temporal features in the form of a Hierarchy Of Time-Surfaces (HOTS). Built from asynchronous events captured by a neuromorphic camera, these time surfaces encode the local dynamics of a visual scene and support an efficient event-based pattern recognition architecture. Inspired by neuroscience, we extend this method to improve its performance. First, we add homeostatic gain control on the activity of neurons to improve the learning of spatio-temporal patterns (Grimaldi et al., 2021). We also provide a new mathematical formalism that draws an analogy between the HOTS algorithm and Spiking Neural Networks (SNN). Following this analogy, we transform the offline pattern categorization method into an online, event-driven layer: this classifier uses the spiking output of the network to define new time surfaces, on which we perform online classification with a neuromimetic implementation of multinomial logistic regression. These improvements not only consistently increase the performance of the network but also bring this event-driven pattern recognition algorithm fully online. The results have been validated on different datasets: Poker-DVS (Serrano-Gotarredona and Linares-Barranco, 2015), N-MNIST (Orchard, Jayawant et al., 2015) and DVS Gesture (Amir et al., 2017). This demonstrates the efficiency of this bio-realistic SNN for ultra-fast object recognition through an event-by-event categorization process.
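The abstract's online, event-driven classifier can be sketched as a multinomial logistic regression that takes one SGD step per event, so a class estimate is available at any point in the stream. This is a generic illustration of that scheme, not the paper's implementation; the feature size, learning rate, and synthetic two-class data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max()          # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

class OnlineMLR:
    """Event-driven multinomial logistic regression (illustrative sketch).

    Each incoming event contributes a feature vector (e.g. a flattened
    time surface); the weights take one SGD step per event, so a class
    prediction is available at any time during processing.
    """
    def __init__(self, n_features, n_classes, lr=0.1):
        self.W = np.zeros((n_classes, n_features))
        self.lr = lr

    def predict_proba(self, x):
        return softmax(self.W @ x)

    def update(self, x, label):
        p = self.predict_proba(x)
        p[label] -= 1.0      # gradient of cross-entropy w.r.t. the logits
        self.W -= self.lr * np.outer(p, x)

# Usage: two synthetic "classes" whose features have different means.
clf = OnlineMLR(n_features=25, n_classes=2)
for _ in range(500):
    label = int(rng.integers(2))
    x = rng.normal(loc=label, scale=0.5, size=25)
    clf.update(x, label)
probs = clf.predict_proba(np.full(25, 1.0))  # query close to class 1
```

Because the update is a single outer product per event, the classifier stays online and cheap enough to run event by event.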

Source
DOI: http://dx.doi.org/10.1016/j.neunet.2024.106415

Publication Analysis

Top Keywords

pattern recognition (12)
object recognition (8)
time surfaces (8)
recognition (5)
robust event-driven (4)
event-driven approach (4)
approach always-on (4)
always-on object (4)
recognition propose (4)
propose neuromimetic (4)
