Echo state networks (ESNs) are recurrent models for time series processing that operate under the echo state property (ESP). The ESP is a notion of stability that imposes an asymptotic fading of the memory of the input. However, the resulting inherent architectural bias of ESNs may lead to an excessive loss of information, which in turn harms performance on tasks with long short-term memory requirements.
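To make the fading-memory behavior concrete, below is a minimal sketch of a standard ESN state update in Python/NumPy. It is not the model from the abstract above: all dimensions, weight ranges, and the spectral radius value are illustrative assumptions. The final print shows that two runs differing only in their first input converge to nearly identical states, which is the input-forgetting behavior the ESP describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and scaling (assumptions, not values from the paper).
n_inputs, n_reservoir = 1, 100
spectral_radius = 0.9  # rescaling below 1 encourages the echo state property

# Random input and recurrent weights; W is rescaled to the target spectral radius.
W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def esn_states(inputs):
    """Run the reservoir over a 1-D input sequence and collect the states."""
    x = np.zeros(n_reservoir)
    states = []
    for u in inputs:
        # Standard ESN update: x(t) = tanh(W_in u(t) + W x(t-1))
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Fading memory in action: two runs that differ only in their first input
# converge once they are driven by the same shared suffix of the signal.
suffix = rng.standard_normal(200)
run_a = esn_states(np.concatenate(([+1.0], suffix)))
run_b = esn_states(np.concatenate(([-1.0], suffix)))
print("state gap after washout:", np.linalg.norm(run_a[-1] - run_b[-1]))
```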
In this paper, we provide a novel approach to the architectural design of deep Recurrent Neural Networks using signal frequency analysis. In particular, focusing on the Reservoir Computing framework and inspired by the principles related to the inherent effect of layering, we address a fundamental open issue in deep learning, namely the question of how to establish the number of layers in recurrent architectures in the form of deep echo state networks (DeepESNs). The proposed method is first analyzed and refined in a controlled scenario and then experimentally assessed on challenging real-world tasks.
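The sketch below illustrates the layering effect this line of work builds on: stacking reservoirs so that each layer is driven by the state sequence of the one beneath it, and inspecting the frequency content of each layer's states. It is a loose illustration under stated assumptions, not a reproduction of the paper's layer-selection procedure; the spectral-centroid diagnostic and all hyperparameters are hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_reservoir(n_in, n_res, rho=0.9):
    """Random reservoir with the recurrent matrix rescaled to spectral radius rho."""
    W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
    W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
    W *= rho / max(abs(np.linalg.eigvals(W)))
    return W_in, W

def run_layer(inputs, W_in, W, leak=0.5):
    """Leaky-integrator update; a layer's states become the next layer's inputs."""
    x = np.zeros(W.shape[0])
    states = []
    for u in inputs:
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Stack layers: layer l is driven by the state sequence of layer l-1.
signal = np.sin(np.linspace(0, 40 * np.pi, 2000))[:, None]
layer_input, n_res = signal, 50
for layer in range(1, 5):
    W_in, W = make_reservoir(layer_input.shape[1], n_res)
    states = run_layer(layer_input, W_in, W)
    # Mean magnitude spectrum of the states; deeper layers tend to
    # concentrate their activity on progressively lower frequencies.
    spectrum = np.abs(np.fft.rfft(states, axis=0)).mean(axis=1)
    centroid = (np.arange(len(spectrum)) * spectrum).sum() / spectrum.sum()
    print(f"layer {layer}: spectral centroid = {centroid:.1f}")
    layer_input = states
```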
Echo State Networks (ESNs) constitute an emerging approach for efficiently modeling Recurrent Neural Networks (RNNs). In this paper we investigate some of the main aspects that can account for the success and limitations of this class of models. In particular, we propose complementary classes of factors related to the contractivity and the architecture of reservoirs, and we study their relative relevance.
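As a pointer to what contractivity-related factors look like in practice, the snippet below computes two standard scalar diagnostics of a reservoir's recurrent matrix. This is general ESN background (going back to Jaeger's conditions for the ESP with tanh units), not an excerpt of the paper's analysis, and the matrix here is a random illustrative example.

```python
import numpy as np

rng = np.random.default_rng(2)
W = rng.uniform(-0.5, 0.5, size=(100, 100))

# Two standard diagnostics for a reservoir's recurrent matrix W:
# - largest singular value: sigma_max(W) < 1 is a sufficient condition for
#   the ESP with tanh neurons (the state map is a contraction);
# - spectral radius: rho(W) < 1 is a necessary condition, and reservoirs
#   are commonly rescaled to a chosen value of rho in practice.
sigma_max = np.linalg.norm(W, 2)
rho = max(abs(np.linalg.eigvals(W)))
print(f"largest singular value: {sigma_max:.3f}, spectral radius: {rho:.3f}")
```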