This paper investigates the approximation abilities of a special class of discrete-time dynamic neural networks. The networks considered are called locally recurrent globally feed-forward, because they are built from dynamic neuron models containing internal feedback, while the interconnections between neurons are strictly feed-forward, as in the well-known multi-layer perceptron. The paper presents analytical results showing that a locally recurrent network with two hidden layers can approximate a state-space trajectory produced by any Lipschitz continuous function with arbitrary accuracy. Moreover, based on these results, the network can be simplified and transformed into a more practical structure suited to real-world applications.
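To make the architecture concrete, below is a minimal sketch of the locally-recurrent-globally-feed-forward idea: each neuron filters its weighted input sum through an internal IIR filter (the local feedback), while the connections between layers remain purely feed-forward. The class names, filter order, initialization, and tanh activation are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

class DynamicNeuron:
    """Neuron with an internal IIR filter: feedback stays inside the neuron."""
    # Hypothetical sketch; the specific filter form is an assumption.
    def __init__(self, n_inputs, order=2, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        self.w = rng.standard_normal(n_inputs) * 0.5   # static input weights
        self.b = rng.standard_normal(order + 1) * 0.5  # feed-forward filter taps
        self.a = rng.standard_normal(order) * 0.1      # feedback filter taps
        self.phi_hist = np.zeros(order + 1)  # past weighted input sums
        self.y_hist = np.zeros(order)        # past filter outputs (inner feedback)

    def step(self, u):
        phi = self.w @ u                                   # weighted sum of inputs
        self.phi_hist = np.roll(self.phi_hist, 1)
        self.phi_hist[0] = phi
        y = self.b @ self.phi_hist - self.a @ self.y_hist  # IIR filter update
        self.y_hist = np.roll(self.y_hist, 1)
        self.y_hist[0] = y
        return np.tanh(y)                                  # static activation

class LRGFNetwork:
    """Two hidden layers of dynamic neurons; strictly feed-forward between layers."""
    def __init__(self, n_in, n_h1, n_h2, n_out, rng=None):
        rng = np.random.default_rng(0) if rng is None else rng
        self.layer1 = [DynamicNeuron(n_in, rng=rng) for _ in range(n_h1)]
        self.layer2 = [DynamicNeuron(n_h1, rng=rng) for _ in range(n_h2)]
        self.w_out = rng.standard_normal((n_out, n_h2)) * 0.5  # linear read-out

    def step(self, u):
        h1 = np.array([n.step(u) for n in self.layer1])   # first hidden layer
        h2 = np.array([n.step(h1) for n in self.layer2])  # second hidden layer
        return self.w_out @ h2

# Usage: drive the network with an input sequence and collect its trajectory.
net = LRGFNetwork(n_in=1, n_h1=6, n_h2=6, n_out=1)
inputs = np.sin(0.1 * np.arange(200))[:, None]
trajectory = np.array([net.step(u) for u in inputs])
```

Note that all memory lives inside the neurons' filter states; the layer-to-layer signal flow has no recurrence, which is what distinguishes this structure from a fully recurrent network.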


Source: http://dx.doi.org/10.1016/j.neunet.2007.10.004

