Accelerating spiking neural network simulations with PymoNNto and PymoNNtorch.

Front Neuroinform

Department of Mathematics, Statistics, and Computer Science - College of Science, University of Tehran, Tehran, Iran.

Published: February 2024

AI Article Synopsis

  • Spiking neural network simulations are key in fields like Computational Neuroscience, AI, and Neuromorphic Engineering, with various simulators available for different applications.
  • PymoNNto is a recent Python toolbox for spiking neural networks that lets users embed custom code in a modular, flexible way; it uses a NumPy backend and already offers GPU support.
  • PymoNNtorch builds on this design with a native PyTorch backend and, in certain scenarios, outperforms established simulators such as NEST, ANNarchy, and Brian 2 thanks to GPU acceleration and optimized indexing operations.

Article Abstract

Spiking neural network simulations are a central tool in Computational Neuroscience, Artificial Intelligence, and Neuromorphic Engineering research. A broad range of simulators and software frameworks for such simulations exist with different target application areas. Among these, PymoNNto is a recent Python-based toolbox for spiking neural network simulations that emphasizes the embedding of custom code in a modular and flexible way. While PymoNNto already supports GPU implementations, its backend relies on NumPy operations. Here we introduce PymoNNtorch, which is natively implemented with PyTorch while retaining PymoNNto's modular design. Furthermore, we demonstrate how changes to the implementations of common network operations in combination with PymoNNtorch's native GPU support can offer speed-up over conventional simulators like NEST, ANNarchy, and Brian 2 in certain situations. Overall, we show how PymoNNto's modular and flexible design in combination with PymoNNtorch's GPU acceleration and optimized indexing operations facilitate research and development of spiking neural networks in the Python programming language.
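To make the performance claim concrete, the sketch below illustrates, in plain PyTorch rather than PymoNNtorch's actual source code, the kind of indexing optimization the abstract alludes to: when spiking activity is sparse, gathering only the weight columns of neurons that fired can be cheaper than a full dense matrix-vector product. All names, sizes, and the firing rate are illustrative assumptions, not values from the paper.

import torch

# Illustrative sizes; roughly 1% of presynaptic neurons fire per step (assumption).
n_pre, n_post = 10_000, 8_000
device = "cuda" if torch.cuda.is_available() else "cpu"

weights = torch.rand(n_post, n_pre, device=device)   # dense synaptic weight matrix
spikes = torch.rand(n_pre, device=device) < 0.01     # boolean spike vector

# Dense propagation: full matrix-vector product with the 0/1 spike vector.
psp_dense = weights @ spikes.float()

# Index-based propagation: gather only the columns of neurons that spiked and sum them.
active = torch.nonzero(spikes, as_tuple=True)[0]
psp_indexed = weights[:, active].sum(dim=1)

# Both routes yield the same postsynaptic input (up to floating-point rounding).
assert torch.allclose(psp_dense, psp_indexed, atol=1e-4)

Whether the indexed route actually wins depends on firing sparsity, network size, and hardware; the abstract accordingly reports speed-ups over conventional simulators only "in certain situations".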


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10913591
DOI: http://dx.doi.org/10.3389/fninf.2024.1331220

Publication Analysis

Top Keywords

spiking neural (16)
neural network (12)
network simulations (12)
modular flexible (8)
pymonnto's modular (8)
combination pymonntorch's (8)
accelerating spiking (4)
neural (4)
network (4)
simulations (4)

Similar Publications

Hardware neural networks could perform certain computational tasks orders of magnitude more energy-efficiently than conventional computers. Artificial neurons are a key component of these networks and are currently implemented with electronic circuits based on capacitors and transistors. However, artificial neurons based on memristive devices are a promising alternative, owing to their potentially smaller size and inherent stochasticity.


Bioinspired Nanofluidic Circuits with Integrating Excitatory and Inhibitory Synapses.

Nano Lett

January 2025

State Key Laboratory of Physical Chemistry of Solid Surfaces, College of Chemistry and Chemical Engineering, Xiamen University, Xiamen 361005, China.

Brain neural networks intricately integrate excitatory and inhibitory synaptic potentials to modulate the generation or suppression of action potentials, laying the foundation for neuronal computation. Although bioinspired nanofluidic systems have replicated some synaptic functions, complete integration of postsynaptic potentials remains unachieved. In this work, the developed ion concentration gradient nanofluidic memristor (ICGNM) modulates memristive effects through ion concentration gradient adjustments and exhibits synaptic plasticity phenomena, including paired-pulse facilitation, paired-pulse depression, and spike-rate-dependent plasticity.


Towards parameter-free attentional spiking neural networks.

Neural Netw

January 2025

Department of Information Technology, Ghent University, Gent, Belgium. Electronic address:

Brain-inspired spiking neural networks (SNNs) are increasingly explored for their potential in spatiotemporal information modeling and energy efficiency on emerging neuromorphic hardware. Recent works incorporate attentional modules into SNNs, greatly enhancing their capabilities in handling sequential data. However, these parameterized attentional modules have placed a huge burden on memory consumption, a factor that is constrained on neuromorphic chips.


Cortical coding of gustatory and thermal signals in active licking mice.

J Physiol

January 2025

Department of Biological Science, Programs in Neuroscience, Molecular Biophysics and Cell and Molecular Biology, Florida State University, Tallahassee, Florida, USA.

Eating behaviours are influenced by the integration of gustatory, olfactory and somatosensory signals, which all contribute to the perception of flavour. Although extensive research has explored the neural correlates of taste in the gustatory cortex (GC), less is known about its role in encoding thermal information. This study investigates the encoding of oral thermal and chemosensory signals by GC neurons compared to the oral somatosensory cortex.


Our purpose was to compare the influence of the spectral content of motor unit recordings on the calculation of electromechanical delay and on the prediction of force fluctuations from measures of the variability in discharge times and neural drive during steady isometric contractions with the first dorsal interosseus muscle. Participants (n = 42; 60 ± 13 yrs) performed contractions at 5% and 20% MVC. After satisfying inclusion criteria, high-density surface EMG recordings from a subset of 23 participants were decomposed into the discharge times of 530 motor units.

