We investigate the qualitative properties of a recurrent neural network (RNN) for minimizing a nonlinear, continuously differentiable, convex objective function over any given nonempty, closed, and convex subset, bounded or unbounded, by exploiting some key inequalities in mathematical programming. The global existence and boundedness of the solution of the RNN are proved when the objective function is convex and has a nonempty constrained minimum set. Under the same assumption, the RNN is shown to be globally convergent in the sense that every trajectory of the RNN converges to some equilibrium point of the RNN. If the objective function itself is uniformly convex and its gradient is a locally Lipschitz continuous mapping, then the RNN is globally exponentially convergent in the sense that every trajectory converges exponentially to the unique equilibrium point of the RNN. These qualitative properties make the network model well suited for solving convex minimization problems over any given nonempty, closed, and convex subset, whether or not the constraint set is bounded.
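
The abstract does not spell out the network dynamics, so the following is only a minimal sketch of one standard projection-type RNN for this problem class, dx/dt = P_Omega(x - alpha * grad f(x)) - x, whose equilibria are exactly the constrained minimizers of a convex f. The step parameter alpha, the box constraint set, and the forward-Euler integration are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def project_box(x, lower, upper):
    """Projection P_Omega onto a box constraint set Omega = [lower, upper]^n.
    Any nonempty closed convex set with a computable projection would do."""
    return np.clip(x, lower, upper)

def rnn_trajectory(grad_f, x0, lower, upper, alpha=0.5, dt=0.01, steps=5000):
    """Forward-Euler integration of the assumed projection dynamics
    dx/dt = P_Omega(x - alpha * grad_f(x)) - x."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        dx = project_box(x - alpha * grad_f(x), lower, upper) - x
        x = x + dt * dx
    return x

# Example: minimize f(x) = ||x - c||^2 / 2 over the box [0, 1]^2.
# This f is uniformly (strongly) convex with a Lipschitz gradient, the
# setting in which the abstract claims exponential convergence to the
# unique equilibrium point.
c = np.array([1.5, -0.5])
grad_f = lambda x: x - c
x_star = rnn_trajectory(grad_f, x0=np.zeros(2), lower=0.0, upper=1.0)
print(x_star)  # approx [1.0, 0.0]: the projection of c onto the box
```

At an equilibrium x = P_Omega(x - alpha * grad f(x)), the variational inequality characterizing a constrained minimizer holds, which is why the trajectory above settles at the projection of c onto the box rather than at the unconstrained minimizer c.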

Source: http://dx.doi.org/10.1109/72.963790
