Small universal spiking neural P systems with dendritic/axonal delays and dendritic trunk/feedback.

Neural Netw

Instituto Politécnico Nacional ESIME Culhuacan, Av. Santana 1000, Coyoacan, 04260, Ciudad de México, Mexico.

Published: June 2021

AI Article Synopsis

  • The text discusses spiking neural P (SN P) systems, which model how neurons communicate through spikes, and highlights the role of dendritic trees in learning and memory.
  • It introduces a new variant, the DACSN P system, which incorporates biological features such as dendritic feedback and dendritic/axonal delays to improve computational performance while using fewer resources.
  • The study proves that DACSN P systems can compute any Turing computable function, establishing their universality, and gives a concrete small universal SN P system as an example.

Article Abstract

In spiking neural P (SN P) systems, neurons are interconnected by means of synapses and use spikes to communicate with each other. In biology, however, the complex structure of the dendritic tree is also an important part of the communication scheme between neurons, since these structures are linked to advanced neural processes such as learning and memory formation. In this work, we present a new variant of SN P systems inspired by several dendrite and axon phenomena: dendritic feedback, the dendritic trunk, dendritic delays and axonal delays. This new variant is referred to as a spiking neural P system with dendritic and axonal computation (DACSN P system). Specifically, we include experimentally proven biological features in current SN P systems to reduce the computational complexity of the soma by providing it with stable firing patterns through dendritic delays, dendritic feedback and axonal delays. As a consequence, the proposed DACSN P systems use a minimal number of synapses and neurons with simple, homogeneous standard spiking rules. Here, we study the computational capabilities of DACSN P systems. In particular, we prove that DACSN P systems with dendritic and axonal behavior are universal both as number-accepting and as number-generating devices. In addition, we construct a small universal SN P system using 39 neurons with standard spiking rules to compute any Turing computable function.
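To make the delay mechanism concrete, the following Python sketch is a deliberately simplified illustration (not code from the paper): a single neuron with one standard rule of the form a^c -> a; d, where the emitted spike travels along the axon and arrives after an axonal delay of d steps. Full SN P semantics (regular-expression guards on rules, neurons closing during delays, dendritic feedback) are omitted; the rule format and simulation loop here are assumptions made for illustration only.

```python
from collections import deque

def simulate(steps, init_spikes, consume, delay):
    """Toy single-neuron SN P simulation with an axonal delay.

    The neuron fires whenever it holds at least `consume` spikes,
    consuming them and placing one spike on the axon; the spike
    reaches the target `delay` steps later. Returns the time steps
    at which spikes arrive at the target.
    """
    spikes = init_spikes
    in_transit = deque()   # remaining travel times of spikes on the axon
    arrivals = []
    for t in range(steps):
        # deliver spikes whose axonal delay has elapsed
        while in_transit and in_transit[0] == 0:
            in_transit.popleft()
            arrivals.append(t)
        # apply the standard rule if enough spikes are stored in the soma
        if spikes >= consume:
            spikes -= consume
            in_transit.append(delay)
        # advance spikes along the axon
        in_transit = deque(d - 1 for d in in_transit)
    return arrivals

# 6 initial spikes, rule consumes 2, axonal delay 3:
# the neuron fires at t = 0, 1, 2, so spikes arrive at t = 3, 4, 5
print(simulate(steps=10, init_spikes=6, consume=2, delay=3))  # [3, 4, 5]
```

The delay line (`in_transit`) is the key idea: it decouples when the soma fires from when the spike influences the target, which is the stabilizing effect the abstract attributes to dendritic and axonal delays.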

Source
http://dx.doi.org/10.1016/j.neunet.2021.02.010

Publication Analysis

Top Keywords

spiking neural (12), dendritic (9), small universal (8), neural systems (8), delays dendritic (8), dendritic feedback (8), dendritic delays (8), axonal delays (8), dendritic axonal (8), dacsn system (8)

Similar Publications

Adaptive behavior depends on the ability to predict specific events, particularly those related to rewards. Armed with such associative information, we can infer the current value of predicted rewards based on changing circumstances and desires. To support this ability, neural systems must represent both the value and identity of predicted rewards, and these representations must be updated when they change.

Traffic classification in SDN-based IoT network using two-level fused network with self-adaptive manta ray foraging.

Sci Rep

January 2025

Department of Computer Science, College of Computer and Information Sciences, Majmaah University, 11952, Al-Majmaah, Saudi Arabia.

The rapid expansion of IoT networks, combined with the flexibility of Software-Defined Networking (SDN), has significantly increased the complexity of traffic management, requiring accurate classification to ensure optimal quality of service (QoS). Existing traffic classification techniques often rely on manual feature selection, limiting adaptability and efficiency in dynamic environments. This paper presents a novel traffic classification framework for SDN-based IoT networks, introducing a Two-Level Fused Network integrated with a self-adaptive Manta Ray Foraging Optimization (SMRFO) algorithm.

Model optimization is a problem of great concern and challenge for developing an image classification model. In image classification, selecting the appropriate hyperparameters can substantially boost the model's ability to learn intricate patterns and features from complex image data. Hyperparameter optimization helps to prevent overfitting by finding the right balance between complexity and generalization of a model.

We present a Spiking Neural Network (SNN) model that incorporates learnable synaptic delays through two approaches: per-synapse delay learning via Dilated Convolutions with Learnable Spacings (DCLS) and a dynamic pruning strategy that also serves as a form of delay learning. In the latter approach, the network dynamically selects and prunes connections, optimizing the delays in sparse connectivity settings. We evaluate both approaches on the Raw Heidelberg Digits keyword spotting benchmark using Backpropagation Through Time with surrogate gradients.

The pursuit of artificial neural networks that mirror the accuracy, efficiency and low latency of biological neural networks remains a cornerstone of artificial intelligence (AI) research. Here, we incorporated recent neuroscientific findings on self-inhibiting autapses and neuron heterogeneity to build a spiking neural network (SNN) with enhanced learning and memory capacities. A bi-level programming paradigm was formulated to learn neuron-level biophysical variables and network-level synapse weights, respectively, for nested heterogeneous learning.
