IEEE Trans Neural Netw Learn Syst
March 2020
Learning long-term dependencies (LTDs) with recurrent neural networks (RNNs) is challenging due to their limited internal memories. In this paper, we propose a new external memory architecture for RNNs called an external addressable long-term and working memory (EALWM)-augmented RNN. This architecture has two distinct advantages over existing neural external memory architectures: the external memory is divided into two addressable parts, a long-term memory and a working memory, and the network can learn LTDs without suffering from vanishing gradients, under the necessary assumptions.
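The abstract only outlines the idea of an RNN reading from and writing to two addressable external memory banks; the sketch below is a minimal illustration of that general pattern, not the authors' EALWM architecture. All names (EALWMCellSketch, W_key, W_write, mem_slots, mem_width) and the content-based addressing scheme are assumptions for illustration.

```python
import numpy as np


def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()


class EALWMCellSketch:
    """Toy RNN cell with two addressable external memories (a sketch, not
    the paper's method): a long-term store that is only read from, and a
    working store that is read from and written to each step.
    Addressing is content-based: softmax over dot-product similarity."""

    def __init__(self, input_dim, hidden_dim, mem_slots, mem_width, seed=0):
        rng = np.random.default_rng(seed)
        concat_dim = input_dim + hidden_dim + 2 * mem_width
        self.W_h = rng.normal(0, 0.1, (hidden_dim, concat_dim))      # recurrent update
        self.W_key = rng.normal(0, 0.1, (mem_width, hidden_dim))     # read-key projection
        self.W_write = rng.normal(0, 0.1, (mem_width, hidden_dim))   # write-content projection
        self.long_term = rng.normal(0, 0.1, (mem_slots, mem_width))  # long-term memory (read-only here)
        self.working = np.zeros((mem_slots, mem_width))              # working memory (read/write)

    def _read(self, memory, key):
        weights = softmax(memory @ key)   # address slots by content similarity
        return weights @ memory, weights  # weighted read vector and addressing weights

    def step(self, x, h):
        key = self.W_key @ h
        ltm_read, _ = self._read(self.long_term, key)
        wm_read, w = self._read(self.working, key)
        # update the hidden state from the input, previous state, and both memory reads
        h_new = np.tanh(self.W_h @ np.concatenate([x, h, ltm_read, wm_read]))
        # write new content into the working memory at the addressed slots
        self.working += np.outer(w, self.W_write @ h_new)
        return h_new


if __name__ == "__main__":
    # run the cell over a short random sequence
    cell = EALWMCellSketch(input_dim=8, hidden_dim=16, mem_slots=32, mem_width=16)
    h = np.zeros(16)
    for x in np.random.default_rng(1).normal(size=(10, 8)):
        h = cell.step(x, h)
    print(h.shape)  # (16,)
```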