Learning long-term dependences (LTDs) with recurrent neural networks (RNNs) is challenging due to their limited internal memory. In this paper, we propose a new external memory architecture for RNNs called an external addressable long-term and working memory (EALWM)-augmented RNN. This architecture has two distinct advantages over existing neural external memory architectures: the external memory is divided into two parts, long-term memory and working memory, both of which are addressable; and the network can learn LTDs without suffering from vanishing gradients, under the necessary assumptions. Experimental results on algorithm learning, language modeling, and question answering demonstrate that the proposed neural memory architecture is promising for practical applications.
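The abstract does not include an implementation, so the following is only a minimal sketch, assuming a common design for RNNs with addressable external memory: a controller RNN that reads from a slowly changing long-term memory bank and a rapidly updated working memory bank via content-based addressing, then writes back to the working memory. All names (EALWMCell, n_lt, n_wm, mem_dim) and the FIFO write rule are hypothetical, not taken from the paper.

```python
# Illustrative sketch only; this is NOT the authors' published implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EALWMCell(nn.Module):
    """RNN cell augmented with two addressable external memories (assumed design)."""

    def __init__(self, input_size, hidden_size, n_lt=64, n_wm=16, mem_dim=32):
        super().__init__()
        self.mem_dim = mem_dim
        # Long-term memory: a persistent, learned bank of slots (assumption).
        self.long_term = nn.Parameter(0.1 * torch.randn(n_lt, mem_dim))
        # Controller consumes the input plus one read from each memory bank.
        self.cell = nn.GRUCell(input_size + 2 * mem_dim, hidden_size)
        # Heads producing read keys and the vector written to working memory.
        self.key_lt = nn.Linear(hidden_size, mem_dim)
        self.key_wm = nn.Linear(hidden_size, mem_dim)
        self.write = nn.Linear(hidden_size, mem_dim)

    def _read(self, memory, key):
        # Content-based addressing: softmax over scaled dot-product similarities.
        scores = torch.einsum("bnd,bd->bn", memory, key) / self.mem_dim ** 0.5
        weights = F.softmax(scores, dim=-1)
        return torch.einsum("bn,bnd->bd", weights, memory)

    def forward(self, x, h, wm):
        # x: (batch, input_size); h: (batch, hidden); wm: (batch, n_wm, mem_dim)
        lt = self.long_term.unsqueeze(0).expand(x.size(0), -1, -1)
        r_lt = self._read(lt, self.key_lt(h))
        r_wm = self._read(wm, self.key_wm(h))
        h = self.cell(torch.cat([x, r_lt, r_wm], dim=-1), h)
        # FIFO-style write: drop the oldest working-memory slot, append a new one.
        wm = torch.cat([wm[:, 1:], self.write(h).unsqueeze(1)], dim=1)
        return h, wm
```

In this sketch the working memory would typically be initialized to zeros, e.g. wm = torch.zeros(batch, 16, 32), and the cell applied step by step over a sequence.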


Source: http://dx.doi.org/10.1109/TNNLS.2019.2910302

Publication Analysis

Top Keywords (occurrences)
external memory: 12
recurrent neural: 8
neural networks: 8
external addressable: 8
addressable long-term: 8
long-term working: 8
working memory: 8
learning long-term: 8
long-term dependences: 8
memory architecture: 8
