This paper investigates the approximation abilities of a special class of discrete-time dynamic neural networks. The networks considered are called locally recurrent globally feed-forward, because they are built from dynamic neuron models that contain internal feedback, while the interconnections between neurons are strictly feed-forward, as in the well-known multi-layer perceptron. The paper presents analytical results showing that a locally recurrent network with two hidden layers can approximate a state-space trajectory produced by any Lipschitz continuous function to arbitrary accuracy. Moreover, based on these results, the network can be simplified and transformed into a more practical structure suited to real-world applications.
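To make the architecture concrete, below is a minimal sketch of one dynamic neuron of the kind the abstract describes: a static weighted sum of inputs feeds a linear filter with internal (local) feedback, followed by a static nonlinearity. The specific choice of an IIR filter, the coefficient names `a`/`b`, and the `tanh` activation are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

class DynamicNeuron:
    """A locally recurrent neuron: static input weights feed a linear IIR
    filter, whose output passes through a nonlinear activation. Feedback
    is internal to the neuron; connections between neurons (arranging
    such units into layers) remain strictly feed-forward."""

    def __init__(self, n_inputs, order, rng=None):
        rng = np.random.default_rng(rng)
        self.w = rng.standard_normal(n_inputs) * 0.5   # static input weights
        self.a = rng.uniform(-0.3, 0.3, order)         # feedback (AR) coefficients
        self.b = rng.standard_normal(order + 1) * 0.5  # feed-forward (MA) coefficients
        self.x_hist = np.zeros(order + 1)              # x(k), x(k-1), ..., x(k-r)
        self.y_hist = np.zeros(order)                  # y(k-1), ..., y(k-r)

    def step(self, u):
        """Process one input vector u at time step k; return neuron output."""
        x = self.w @ u                                 # weighted sum of inputs
        self.x_hist = np.roll(self.x_hist, 1)
        self.x_hist[0] = x
        # IIR filter: y(k) = b . [x(k), ..., x(k-r)] - a . [y(k-1), ..., y(k-r)]
        y = self.b @ self.x_hist - self.a @ self.y_hist
        self.y_hist = np.roll(self.y_hist, 1)
        self.y_hist[0] = y
        return np.tanh(y)                              # static nonlinearity
```

Stacking two hidden layers of such units, with outputs of one layer feeding the inputs of the next exactly as in a multi-layer perceptron, yields the locally recurrent globally feed-forward structure whose approximation properties the paper analyzes.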
DOI: http://dx.doi.org/10.1016/j.neunet.2007.10.004