Weight-update methods play an important role in improving the performance of neural networks. To mitigate the oscillation that arises when training radial basis function (RBF) neural networks, a fractional-order gradient descent with momentum method for updating the weights of an RBF neural network (FOGDM-RBF) is proposed for data classification, and its convergence is proved. To speed up convergence, an adaptive learning rate adjusts the step size during training. The proposed algorithm is tested on the Iris and MNIST data sets, and the results confirm its theoretical properties, such as monotonicity and convergence. Non-parametric statistical tests, including the Friedman test and the Quade test, are used to compare the proposed algorithm with other algorithms, and the influence of the fractional order, the learning rate, and the batch size is analysed. Error analysis shows that the algorithm effectively accelerates the convergence of gradient descent and improves its performance, achieving high accuracy and validity.
DOI: http://dx.doi.org/10.1080/0954898X.2020.1849842
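The abstract names the update scheme but the exact rule appears only in the full paper. As an illustration, the sketch below implements one common Caputo-style approximation used in fractional-order gradient methods, where the ordinary gradient is scaled by |w - w_prev|^(1-alpha) / Gamma(2-alpha) before the momentum accumulation. The function name `fogdm_update`, the hyperparameter values, and this particular scaling are assumptions for demonstration, not the authors' exact FOGDM-RBF formulation.

```python
import numpy as np
from scipy.special import gamma


def fogdm_update(w, w_prev, v, grad, alpha=0.9, lr=0.01, beta=0.9, eps=1e-8):
    """One fractional-order gradient descent with momentum step (sketch).

    Hypothetical formulation: scales the gradient by the Caputo-style
    factor |w - w_prev|^(1 - alpha) / Gamma(2 - alpha); the paper's
    exact update rule may differ.
    """
    # Fractional scaling term; eps keeps the base nonzero when w == w_prev.
    frac = np.abs(w - w_prev + eps) ** (1.0 - alpha) / gamma(2.0 - alpha)
    v_new = beta * v + lr * grad * frac  # momentum accumulation
    return w - v_new, v_new              # updated weights, updated velocity


# Toy usage on f(w) = 0.5 * ||w||^2, whose gradient is simply w.
w, w_prev, v = np.array([1.0, -2.0]), np.zeros(2), np.zeros(2)
for _ in range(200):
    grad = w
    w_next, v = fogdm_update(w, w_prev, v, grad)
    w_prev, w = w, w_next
print(w)  # approaches the minimizer [0, 0]
```

In full training, `grad` would be the backpropagated gradient of the classification loss with respect to the RBF output weights, and the fixed `lr` could be replaced by the adaptive learning rate the abstract mentions.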