In this work, we employ reservoir computing, a recently developed machine learning technique, to predict the time evolution of neuronal activity produced by the Hindmarsh-Rose neuronal model. Our results show accurate short- and long-term predictions for periodic (tonic and bursting) neuronal behaviors, but only accurate short-term predictions for chaotic neuronal states. However, after the short-term predictability deteriorates in the chaotic regime, the predicted output continues to display similarities with the actual neuronal behavior. This is reinforced by a striking resemblance between the bifurcation diagrams of the actual and predicted outputs. Error analyses of the reservoir's performance are consistent with standard results previously obtained.
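As a rough illustration of the setup described above (not the authors' implementation), the sketch below integrates the standard Hindmarsh-Rose equations and trains a conventional echo state network with a ridge-regression readout to predict the trajectory in closed loop. The reservoir size, spectral radius, input scaling, regularization, and Hindmarsh-Rose parameters (a=1, b=3, c=1, d=5, s=4, x_R=-1.6, r=0.006, I=3.1) are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch: echo state network (reservoir computing) predicting the
# Hindmarsh-Rose model. All parameter values are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

# --- Hindmarsh-Rose neuronal model ----------------------------------------
def hindmarsh_rose(t, u, a=1.0, b=3.0, c=1.0, d=5.0,
                   s=4.0, x_r=-1.6, r=0.006, I=3.1):
    x, y, z = u
    dx = y - a * x**3 + b * x**2 - z + I
    dy = c - d * x**2 - y
    dz = r * (s * (x - x_r) - z)
    return [dx, dy, dz]

dt = 0.05
t_eval = np.arange(0.0, 2000.0, dt)
sol = solve_ivp(hindmarsh_rose, (0.0, t_eval[-1]), [0.0, 0.0, 0.0],
                t_eval=t_eval, rtol=1e-8, atol=1e-10)
data = sol.y.T  # shape (T, 3): time series of x, y, z

# --- Echo state network (reservoir) ---------------------------------------
rng = np.random.default_rng(0)
N, rho, sigma, beta = 500, 0.9, 0.5, 1e-6  # size, spectral radius, input scale, ridge
W = rng.uniform(-1, 1, (N, N)) * (rng.random((N, N)) < 0.02)  # sparse recurrent weights
W *= rho / max(abs(np.linalg.eigvals(W)))                     # rescale spectral radius
W_in = sigma * rng.uniform(-1, 1, (N, 3))

def run_reservoir(inputs):
    """Drive the reservoir with an input sequence; return the state history."""
    r = np.zeros(N)
    states = np.empty((len(inputs), N))
    for k, u in enumerate(inputs):
        r = np.tanh(W @ r + W_in @ u)
        states[k] = r
    return states

# Train the linear readout W_out to map reservoir state -> next sample.
washout, n_train = 500, 20000
states = run_reservoir(data[:n_train])
R, Y = states[washout:-1], data[washout + 1:n_train]
W_out = np.linalg.solve(R.T @ R + beta * np.eye(N), R.T @ Y).T

# --- Autonomous (closed-loop) prediction ----------------------------------
r = states[-1]
u = W_out @ r                 # first predicted sample
prediction = [u]
for _ in range(3999):
    r = np.tanh(W @ r + W_in @ u)
    u = W_out @ r             # feed each prediction back as the next input
    prediction.append(u)
prediction = np.array(prediction)

# Compare prediction[:, 0] with data[n_train:n_train + 4000, 0] to assess
# short- and long-term agreement of the membrane-potential variable x.
```

Sweeping the injected current I in both the integrated and the predicted systems, and plotting the extrema of x against I, would give the actual and predicted bifurcation diagrams whose comparison is discussed in the abstract.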
DOI: http://dx.doi.org/10.1063/1.5119723