Convergence in the presence of multiple equilibrium points is one of the most fundamental dynamical properties of a neural network (NN). The goal of the paper is to investigate convergence for the classic Brain-State-in-a-Box (BSB) NN model and for some of its relevant generalizations, named Brain-State-in-a-Convex-Body (BSCB). In particular, BSCB is a class of discrete-time NNs obtained by projecting a linear system onto a convex body of R^n.
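A minimal Python sketch of the two discrete-time iterations just described: a classic BSB update, where a linear map is saturated onto the hypercube [-1, 1]^n, and a BSCB-style update, where the saturation is replaced by the projection onto a generic convex body (here a Euclidean ball). The feedback gain, the weight matrix, and the chosen projection are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def bsb_step(x, W, beta=1.0):
    # Classic BSB iteration: linear update saturated onto the hypercube [-1, 1]^n.
    # The gain beta and the update form x + beta*W*x are assumptions for illustration.
    return np.clip(x + beta * (W @ x), -1.0, 1.0)

def bscb_step(x, W, project):
    # BSCB-style iteration: the same linear map, but projected onto an arbitrary
    # convex body through a user-supplied projection operator.
    return project(W @ x)

def project_ball(y, radius=1.0):
    # Euclidean projection onto the ball of given radius (one example of a convex body).
    n = np.linalg.norm(y)
    return y if n <= radius else (radius / n) * y

W = np.array([[0.9, 0.1], [0.1, 0.9]])  # hypothetical symmetric interconnections
x = np.array([0.3, -0.8])
for _ in range(100):
    x = bscb_step(x, W, project_ball)   # iterate until (numerical) convergence
print(x)
```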
IEEE Trans Neural Netw Learn Syst
October 2024
The article considers a large class of delayed neural networks (NNs) with extended memristors obeying the Stanford model. This is a widely used model that accurately describes the switching dynamics of real nonvolatile memristor devices implemented in nanotechnology. The article studies, via the Lyapunov method, complete stability (CS), i.e., the convergence of trajectories toward equilibrium points.
This article introduces a new class of memristor neural networks (NNs) for solving, in real time, quadratic programming (QP) and linear programming (LP) problems. The networks, which are called memristor programming NNs (MPNNs), use a set of filamentary-type memristors with sharp memristance transitions for constraint satisfaction and an additional set of memristors with smooth memristance transitions for memorizing the result of a computation. The nonlinear dynamics and global optimization capabilities of MPNNs for QP and LP problems are thoroughly investigated via a recently introduced technique called the flux-charge analysis method.
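The MPNN itself is an analog memristor circuit analyzed in the flux-charge domain; as a purely software point of reference for the class of problems it targets, the following is a minimal projected-gradient sketch for a box-constrained QP. This is a conventional numerical method with hypothetical data, not the paper's circuit or analysis.

```python
import numpy as np

def solve_box_qp(Q, c, lo, hi, eta=0.05, iters=2000):
    # Projected-gradient iteration for: minimize 0.5*x^T Q x + c^T x  s.t.  lo <= x <= hi.
    # Q is assumed symmetric positive definite; eta is a small constant step size.
    x = np.clip(np.zeros_like(c), lo, hi)
    for _ in range(iters):
        x = np.clip(x - eta * (Q @ x + c), lo, hi)  # gradient step followed by projection
    return x

# Hypothetical problem data.
Q = np.array([[2.0, 0.5], [0.5, 1.0]])
c = np.array([-1.0, 1.0])
lo = np.array([-1.0, -1.0])
hi = np.array([1.0, 1.0])
print(solve_box_qp(Q, c, lo, hi))
```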
The paper introduces a class of memristor neural networks (NNs) that are characterized by the following salient features. (a) The processing of signals takes place in the flux-charge domain and is based on the time evolution of memristor charges. The processing result is given by the constant asymptotic values of charges that are stored in the memristors acting as non-volatile memories in steady state.
IEEE Trans Neural Netw Learn Syst
May 2018
Recent papers in the literature introduced a class of neural networks (NNs) with memristors, named dynamic-memristor (DM) NNs, in which the analog processing takes place in the charge-flux domain, instead of the typical current-voltage domain used by Hopfield NNs and standard cellular NNs. One key advantage is that, when a steady state is reached, all currents, voltages, and power of a DM-NN drop off, whereas the memristors act as nonvolatile memories that store the processing result. Previous work in the literature addressed multistability of DM-NNs, i.e., convergence of trajectories in the presence of multiple equilibrium points.
IEEE Trans Cybern
October 2017
Recent work has considered a class of cellular neural networks (CNNs) where each cell contains an ideal capacitor and an ideal flux-controlled memristor. One main feature is that during the analog computation the memristor is assumed to be a dynamic element; hence, each cell is second-order, with state variables given by the capacitor voltage and the memristor flux. Such CNNs, named dynamic memristor (DM)-CNNs, were proved to be convergent when a symmetry condition on the cell interconnections is satisfied.
IEEE Trans Cybern
November 2016
This paper considers a class of nonsmooth neural networks with discontinuous hard-limiter (signum) neuron activations for solving time-dependent (TD) systems of algebraic linear equations (ALEs). The networks are defined by the subdifferential, with respect to the state variables, of an energy function given by the L1 norm of the error between the state and the TD-ALE solution. It is shown that when the penalty parameter exceeds a quantitatively estimated threshold, the networks are able to reach in finite time, and exactly track thereafter, the target solution of the TD-ALE.
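A rough Python sketch of how such a nonsmooth penalty flow can be simulated. It assumes the energy is the L1 norm of the residual A(t)x - b(t) and uses a plain forward-Euler discretization with hypothetical time-dependent data; the paper's network equations and error term may differ.

```python
import numpy as np

def track_td_ale(A, b, sigma=10.0, dt=1e-3, t_end=2.0):
    # Forward-Euler simulation of the subgradient flow
    #     dx/dt = -sigma * A(t)^T sign(A(t) x - b(t)),
    # i.e., a descent flow for sigma * ||A(t) x - b(t)||_1 (an illustrative reading
    # of the abstract, not the paper's exact network model).
    x = np.zeros(A(0.0).shape[1])
    t = 0.0
    while t < t_end:
        residual = A(t) @ x - b(t)
        x = x - dt * sigma * (A(t).T @ np.sign(residual))
        t += dt
    return x

# Hypothetical time-dependent linear system A(t) x = b(t).
A = lambda t: np.array([[2.0 + 0.1 * t, 1.0], [0.0, 1.0 + 0.05 * t]])
b = lambda t: np.array([1.0 + np.sin(t), 0.5 * t])
print(track_td_ale(A, b))
```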
IEEE Trans Neural Netw Learn Syst
February 2016
This paper introduces a nonsmooth (NS) neural network that is able to operate in a time-dependent (TD) context and is potentially useful for solving some classes of NS-TD problems. The proposed network is named nonsmooth time-dependent network (NTN) and is an extension, to a TD setting, of a previous NS neural network for programming problems. Suppose C(t), t ≥ 0, is a nonempty TD convex feasibility set defined by TD inequality constraints.
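As an illustration of the time-dependent feasibility problem just described, here is a small exact-penalty-style sketch in Python: a state is pushed toward C(t) by the gradients of the currently violated constraints while C(t) moves. The constraints, gains, and Euler discretization are all assumptions; this is not the NTN's precise differential inclusion.

```python
import numpy as np

def track_td_feasible_set(gs, grads, x0, sigma=5.0, dt=1e-3, t_end=3.0):
    # Forward-Euler sketch of a penalty flow chasing the time-dependent convex set
    #     C(t) = { x : g_i(x, t) <= 0 for all i },
    # where only the currently violated constraints contribute to the update.
    x = x0.astype(float).copy()
    t = 0.0
    while t < t_end:
        step = np.zeros_like(x)
        for g, grad in zip(gs, grads):
            if g(x, t) > 0.0:          # a violated constraint pushes x back toward C(t)
                step += grad(x, t)
        x = x - dt * sigma * step
        t += dt
    return x

# Hypothetical moving constraints: a shifting half-space and a moving unit ball.
g1 = lambda x, t: x[0] + x[1] - np.cos(t)
grad1 = lambda x, t: np.array([1.0, 1.0])
center = lambda t: np.array([np.sin(t), 0.0])
g2 = lambda x, t: float((x - center(t)) @ (x - center(t)) - 1.0)
grad2 = lambda x, t: 2.0 * (x - center(t))
print(track_td_feasible_set([g1, g2], [grad1, grad2], np.array([2.0, 2.0])))
```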
The paper considers nonsmooth neural networks described by a class of differential inclusions termed differential variational inequalities (DVIs). The DVIs include the relevant class of neural networks, introduced by Li, Michel, and Porod, described by linear systems evolving in a closed hypercube of R^n. The main result in the paper is a necessary and sufficient condition for multistability of DVIs with nonsymmetric and cooperative (nonnegative) interconnections between neurons.
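A crude Python sketch of the kind of constrained linear dynamics referred to above: a linear system dx/dt = Ax + b confined to the closed hypercube [-1, 1]^n, simulated by an Euler step followed by projection of the state back onto the hypercube. The continuous-time DVI formulation projects the vector field on the boundary rather than the state, so this is only an approximate illustration with hypothetical data.

```python
import numpy as np

def simulate_hypercube_system(A, b, x0, dt=1e-3, t_end=5.0):
    # Clipped-Euler simulation of dx/dt = A x + b with the state kept inside the
    # closed hypercube [-1, 1]^n (a simple stand-in for the DVI-type dynamics).
    x = np.clip(np.asarray(x0, dtype=float), -1.0, 1.0)
    for _ in range(int(t_end / dt)):
        x = np.clip(x + dt * (A @ x + b), -1.0, 1.0)
    return x

# Hypothetical cooperative (nonnegative) interconnection matrix and bias.
A = np.array([[0.5, 0.3], [0.2, 0.4]])
b = np.array([0.1, -0.1])
print(simulate_hypercube_system(A, b, np.array([0.2, -0.3])))
```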
IEEE Trans Neural Netw Learn Syst
September 2012
Recent papers have pointed out the interest in studying convergence in the presence of multiple equilibrium points (EPs) (multistability) for neural networks (NNs) with nonsymmetric cooperative (nonnegative) interconnections and neuron activations modeled by piecewise-linear (PL) functions. One basic difficulty is that the semiflows generated by such NNs are monotone but, due to the horizontal segments in the PL functions, are not eventually strongly monotone (ESM). This notwithstanding, it has been shown that there are subclasses of irreducible interconnection matrices for which the semiflows, although not ESM, enjoy convergence properties similar to those of ESM semiflows.
IEEE Trans Neural Netw
April 2011
This brief considers a class of delayed full-range (FR) cellular neural networks (CNNs) with uncertain interconnections between neurons modeled by means of intervalized matrices. Using mathematical tools from the theory of differential inclusions, a fundamental result on global robust stability of standard (S) CNNs is extended to prove global robust exponential stability for the corresponding class (same interconnection weights and inputs) of FR-CNNs. The result is of theoretical interest since, in general, the equivalence between the dynamical behavior of FR-CNNs and S-CNNs is not guaranteed.