Artificial neural networks (ANNs), such as convolutional neural networks (CNNs) and long short-term memory (LSTM) networks, are highly complex and contain large numbers of parameters. Memristor-based neural networks, which support in-memory and parallel computing, have therefore been proposed to accelerate ANN operations. In this paper, a memristor-based hardware realization of an LSTM network with in situ training is presented. The designed memristor-based LSTM (MbLSTM) network is composed of a memristor-based LSTM cell and a memristor-based dense layer. Sigmoid and tanh (hyperbolic tangent) activation functions are approximately implemented through deliberate design of circuit parameters. A row-parallel weight update scheme is put forward to update the conductance of memristors in the crossbars. The highlights of MbLSTM include an effective hardware-based inference process and in situ training. The validity of MbLSTM is substantiated through classification tasks, and its robustness to conductance variations is also analyzed.
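To make the crossbar-based inference flow concrete, below is a minimal NumPy sketch of an LSTM cell whose gate weights are stored as differential memristor conductance pairs and evaluated by matrix-vector multiplication, with a lognormal perturbation to mimic conductance variations. The class and parameter names are illustrative assumptions, and ideal software sigmoid/tanh are used in place of the paper's circuit-level activation approximations.

```python
# Hedged sketch of memristor-crossbar LSTM inference (not the paper's circuit).
# Assumptions: signed weights encoded as differential conductance pairs (G+ - G-),
# ideal readout, and software sigmoid/tanh instead of the analog approximations.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class CrossbarLSTMCell:
    def __init__(self, n_in, n_hidden, g_max=1e-3, seed=0):
        self.rng = np.random.default_rng(seed)
        self.n_hidden = n_hidden
        self.g_max = g_max
        # One crossbar per gate (input, forget, output, candidate);
        # columns take the concatenated vector [x_t, h_{t-1}, 1 (bias)].
        n_cols = n_in + n_hidden + 1
        self.crossbars = {}
        for gate in ("i", "f", "o", "g"):
            w = self.rng.uniform(-1.0, 1.0, size=(n_hidden, n_cols))
            # Map signed weights onto two positive conductance matrices.
            g_pos = np.clip(w, 0, None) * g_max
            g_neg = np.clip(-w, 0, None) * g_max
            self.crossbars[gate] = (g_pos, g_neg)

    def add_variation(self, sigma=0.05):
        """Apply multiplicative lognormal conductance variation (robustness test)."""
        for gate, (gp, gn) in self.crossbars.items():
            gp = gp * self.rng.lognormal(0.0, sigma, size=gp.shape)
            gn = gn * self.rng.lognormal(0.0, sigma, size=gn.shape)
            self.crossbars[gate] = (gp, gn)

    def step(self, x_t, h_prev, c_prev):
        v = np.concatenate([x_t, h_prev, [1.0]])  # input voltages + bias column
        pre = {}
        for gate, (gp, gn) in self.crossbars.items():
            # Crossbar MVM: output current ~ (G+ - G-) @ v, rescaled to weight units.
            pre[gate] = (gp - gn) @ v / self.g_max
        i, f, o = sigmoid(pre["i"]), sigmoid(pre["f"]), sigmoid(pre["o"])
        g = np.tanh(pre["g"])
        c = f * c_prev + i * g
        h = o * np.tanh(c)
        return h, c

# Usage: run a short input sequence, then perturb conductances and rerun.
cell = CrossbarLSTMCell(n_in=4, n_hidden=8)
h, c = np.zeros(8), np.zeros(8)
for t in range(5):
    x_t = np.random.default_rng(t).uniform(-1.0, 1.0, 4)
    h, c = cell.step(x_t, h, c)
print("hidden state:", np.round(h, 3))
cell.add_variation(sigma=0.1)  # inject device variation before re-evaluating
```

A full model along these lines would stack a memristor-based dense layer on the final hidden state and apply the row-parallel update rule during in situ training; this sketch covers only the forward (inference) pass.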
DOI: http://dx.doi.org/10.1016/j.neunet.2020.07.035