Tipping prediction of a class of large-scale radial-ring neural networks.

Neural Netw

School of Computer, Data and Mathematical Sciences, Western Sydney University, Sydney, NSW 2751, Australia.

Published: January 2025

AI Article Synopsis

  • This paper addresses the challenge of understanding how collective dynamics emerge in large neural networks by applying dynamical systems theory, with a focus on tipping mechanisms.
  • A new radial-ring neural network model is introduced; the network's characteristic equation is derived and used to assess stability, revealing critical factors such as synaptic delay and self-feedback that shape the network's behavior.
  • The research demonstrates that the larger radial-ring neural network is more robust than smaller networks, and shows how factors such as the activation function and the network topology affect the network's dynamics and tipping points.

Article Abstract

Understanding the emergence and evolution of collective dynamics in large-scale neural networks remains a complex challenge. This paper seeks to address this gap by applying dynamical systems theory, with a particular focus on tipping mechanisms. First, we introduce a novel (n+mn)-scale radial-ring neural network and employ Coates' flow graph topological approach to derive the characteristic equation of the linearized network. Second, through deriving stability conditions and predicting the tipping point using an algebraic approach based on the integral element concept, we identify critical factors such as the synaptic transmission delay, the self-feedback coefficient, and the network topology. Finally, we validate the methodology's effectiveness in predicting the tipping point. The findings reveal that increased synaptic transmission delay can induce and amplify periodic oscillations. Additionally, the self-feedback coefficient and the network topology influence the onset of tipping points. Moreover, the selection of the activation function impacts both the number of equilibrium solutions and the convergence speed of the neural network. Lastly, we demonstrate that the proposed large-scale radial-ring neural network exhibits stronger robustness compared to lower-scale networks with a single topology. The results provide a comprehensive depiction of the dynamics observed in large-scale neural networks under the influence of various factor combinations.
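To make the tipping mechanism concrete, the following is a minimal sketch of the kind of delay-induced analysis described in the abstract, written for a single linearized mode rather than the paper's (n+mn)-scale model. The symbols mu (leak rate), c (linearized activation slope), a_k (an eigenvalue of the connectivity, absorbing the self-feedback coefficient), and tau (synaptic transmission delay) are illustrative assumptions, not quantities taken from the paper.

```latex
% Hedged sketch: one scalar mode of a generic linearized delayed network,
% not the authors' exact characteristic equation.
\begin{gather}
  \dot{u}_k(t) = -\mu\, u_k(t) + c\, a_k\, u_k(t-\tau)
  \quad\Longrightarrow\quad
  \lambda + \mu - c\, a_k\, e^{-\lambda\tau} = 0, \\[4pt]
  % A tipping (Hopf-type) point occurs when a root reaches the imaginary axis:
  \lambda = i\omega:\qquad
  \mu = c\, a_k \cos(\omega\tau), \qquad
  \omega = -\, c\, a_k \sin(\omega\tau), \\[4pt]
  % Assuming net inhibitory feedback, c\, a_k < -\mu < 0, the first crossing is at
  \omega_0 = \sqrt{(c\, a_k)^2 - \mu^2}, \qquad
  \tau_0 = \frac{1}{\omega_0}\,\arccos\!\left(\frac{\mu}{c\, a_k}\right).
\end{gather}
```

Increasing the delay past tau_0 pushes a conjugate pair of roots into the right half-plane, which is the generic mechanism by which a longer synaptic transmission delay can tip a network from a stable equilibrium into sustained periodic oscillation; the paper's algebraic, integral-element-based prediction locates this point for the full radial-ring topology.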

Source: http://dx.doi.org/10.1016/j.neunet.2024.106820 (DOI listing)

Publication Analysis

Top Keywords

radial-ring neural: 12
neural networks: 12
neural network: 12
large-scale radial-ring: 8
large-scale neural: 8
predicting tipping: 8
tipping point: 8
synaptic transmission: 8
transmission delay: 8
self-feedback coefficient: 8

Similar Publications


For decades, studying the dynamic performance of artificial neural networks (ANNs) has been widely considered a good way to gain deeper insight into actual neural networks. However, most ANN models focus on a finite number of neurons and a single topology. These studies are inconsistent with actual neural networks, which are composed of thousands of neurons and sophisticated topologies.

