AI Article Synopsis

  • Developmental plasticity is key to understanding how the brain adapts and changes structure during learning, but current neural network compression methods inadequately mimic these biological processes.
  • A new method called developmental plasticity-inspired adaptive pruning (DPAP) is introduced, which mimics biological pruning mechanisms in the brain and optimizes network structure dynamically during learning.
  • Experiments demonstrate that the DPAP approach significantly enhances performance and speed in compressed deep artificial neural networks and spiking neural networks, achieving state-of-the-art results in certain tasks.

Article Abstract

Developmental plasticity plays a prominent role in shaping the brain's structure during ongoing learning in response to dynamically changing environments. However, existing network compression methods for deep artificial neural networks (ANNs) and spiking neural networks (SNNs) draw little inspiration from the brain's developmental plasticity mechanisms, thus limiting their ability to learn efficiently, rapidly, and accurately. This paper proposes a developmental plasticity-inspired adaptive pruning (DPAP) method, inspired by the adaptive developmental pruning of dendritic spines, synapses, and neurons according to the "use it or lose it, gradually decay" principle. The proposed DPAP model incorporates multiple biologically realistic mechanisms (such as dendritic spine dynamic plasticity, activity-dependent neural spiking traces, and local synaptic plasticity), together with an adaptive pruning strategy, so that the network structure can be dynamically optimized during learning without any pre-training or retraining. Extensive comparative experiments show consistent and remarkable performance and speed boosts for extremely compressed networks on a diverse set of benchmark tasks for deep ANNs and SNNs, especially for the spatio-temporal joint pruning of SNNs on neuromorphic datasets. This work explores how developmental plasticity enables complex deep networks to gradually evolve into brain-like efficient and compact structures, eventually achieving state-of-the-art (SOTA) performance for biologically realistic SNNs.
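The "use it or lose it, gradually decay" principle described in the abstract can be illustrated with a minimal sketch: each synapse keeps an activity trace that decays exponentially and is reinforced when the synapse is used, and synapses whose trace falls below a threshold are permanently pruned during learning. The parameter names, decay rate, and threshold below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hyperparameters (not from the paper): trace decay and prune cutoff.
DECAY = 0.9
PRUNE_THRESHOLD = 0.05

def update_traces(traces, activity):
    """Exponentially decay traces, then reinforce synapses that were active
    ("use it or lose it, gradually decay")."""
    return DECAY * traces + (1.0 - DECAY) * activity

def prune(weights, traces, mask):
    """Permanently remove synapses whose activity trace decayed below threshold."""
    mask = mask & (traces >= PRUNE_THRESHOLD)
    return weights * mask, mask

# Toy run: 8 synapses; only the first 4 receive activity each step.
weights = rng.normal(size=8)
traces = np.ones(8)
mask = np.ones(8, dtype=bool)
activity = np.array([1, 1, 1, 1, 0, 0, 0, 0], dtype=float)

for _ in range(30):
    traces = update_traces(traces, activity)
    weights, mask = prune(weights, traces, mask)

print(mask)  # inactive synapses end up pruned: [True]*4 + [False]*4
```

In this sketch, pruning happens online during the training loop rather than as a separate post-training compression pass, which mirrors the abstract's claim that DPAP needs no pre-training or retraining; the actual method additionally uses dendritic spine plasticity and local synaptic plasticity to drive the traces.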

Source
http://dx.doi.org/10.1109/TPAMI.2024.3467268

Publication Analysis

Top Keywords

  • adaptive pruning (12)
  • neural networks (12)
  • developmental plasticity (12)
  • developmental plasticity-inspired (8)
  • plasticity-inspired adaptive (8)
  • artificial neural (8)
  • biologically realistic (8)
  • developmental (6)
  • pruning (5)
  • networks (5)
