Temporal-kernel recurrent neural networks.

Neural Networks

Department of Computer Science, University of Toronto, Toronto, Canada.

Published: March 2010

A Recurrent Neural Network (RNN) is a powerful connectionist model that can be applied to many challenging sequential problems, including problems that naturally arise in language and speech. However, RNNs are extremely hard to train on problems that have long-term dependencies, where it is necessary to remember events for many timesteps before using them to make a prediction. In this paper we consider the problem of training RNNs to predict sequences that exhibit significant long-term dependencies, focusing on a serial recall task where the RNN needs to remember a sequence of characters for a large number of steps before reconstructing it. We introduce the Temporal-Kernel Recurrent Neural Network (TKRNN), which is a variant of the RNN that can cope with long-term dependencies much more easily than a standard RNN, and show that the TKRNN develops short-term memory that successfully solves the serial recall task by representing the input string with a stable state of its hidden units.
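The abstract does not spell out the TKRNN update equations. As a minimal, hypothetical sketch of the general idea it describes (recurrent connections whose influence decays exponentially with temporal distance, so that events many timesteps back can still shape the current state), consider the following toy recurrence. The function name `tkrnn_step`, the decay rate `lam`, and all shapes are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def tkrnn_step(x_t, h_trace, W_in, W_rec, b, lam=0.9):
    """One step of a toy temporal-kernel-style recurrence (hypothetical sketch).

    Instead of seeing only the previous hidden state, each unit receives an
    exponentially decaying trace of all past hidden activity:
        trace_t = lam * trace_{t-1} + (1 - lam) * h_t
    so an event d steps in the past still contributes with weight on the
    order of lam**d, rather than vanishing after one step.
    """
    h_t = np.tanh(W_in @ x_t + W_rec @ h_trace + b)
    new_trace = lam * h_trace + (1.0 - lam) * h_t
    return h_t, new_trace

# Toy serial-recall-style usage: feed one-hot "characters" and carry the trace.
rng = np.random.default_rng(0)
n_in, n_hid, T = 5, 16, 30
W_in = rng.normal(scale=0.1, size=(n_hid, n_in))
W_rec = rng.normal(scale=0.1, size=(n_hid, n_hid))
b = np.zeros(n_hid)

trace = np.zeros(n_hid)
for t in range(T):
    x_t = np.eye(n_in)[t % n_in]  # stand-in one-hot input symbol
    h, trace = tkrnn_step(x_t, trace, W_in, W_rec, b)
```

The design point this sketch is meant to illustrate is the one the abstract makes: because the trace decays smoothly rather than being overwritten each step, the network can hold a representation of an input string in a stable hidden configuration long enough to reconstruct it later.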

Source
http://dx.doi.org/10.1016/j.neunet.2009.10.009

Publication Analysis

Top Keywords

recurrent neural: 12
long-term dependencies: 12
temporal-kernel recurrent: 8
neural network: 8
serial recall: 8
recall task: 8
neural networks: 4
networks recurrent: 4
rnn: 4
network rnn: 4
