This article emphasizes the significance of intracellular recording in neurophysiology, considering the functions of neurons and, in particular, the role of first spike latency in the response to external stimuli. The study employs machine learning techniques to predict first spike latency from whole-cell patch-clamp recording data. Experiments were conducted on Control (Saline) and Experimental (Harmaline) groups, generating a dataset for developing predictive models. Because the dataset contains a limited number of samples, models that are effective on small datasets were used. Among the regression model families compared (linear, ensemble, and tree-based models), the ensemble models, specifically the LGB (LightGBM) method, achieved the best performance. The results demonstrate accurate prediction of first spike latency, with an average mean squared error of 0.0002 and a mean absolute error of 0.01 in 10-fold cross-validation. The research suggests the potential of machine learning to forecast first spike latency, allowing reliable estimation without extensive animal testing. This intelligent predictive system facilitates efficient analysis of first spike latency changes in both healthy and unhealthy brain cells, streamlining experimentation and providing more detailed insight into the recorded signals.
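The abstract itself contains no code; the sketch below is only an illustration of the kind of pipeline it describes (a LightGBM regressor evaluated with 10-fold cross-validation, reporting mean squared and mean absolute error). The file name, feature columns, and hyperparameters are hypothetical placeholders, not details taken from the paper.

```python
# Hedged sketch of the evaluation pipeline described in the abstract:
# LightGBM regression with 10-fold cross-validation, reporting MSE and MAE.
# The dataset file and column names below are illustrative placeholders.
import numpy as np
import pandas as pd
from lightgbm import LGBMRegressor
from sklearn.model_selection import KFold
from sklearn.metrics import mean_squared_error, mean_absolute_error

df = pd.read_csv("patch_clamp_features.csv")          # hypothetical dataset file
X = df.drop(columns=["first_spike_latency"]).values   # stimulus/recording features
y = df["first_spike_latency"].values                  # target: first spike latency

kf = KFold(n_splits=10, shuffle=True, random_state=42)
mse_scores, mae_scores = [], []
for train_idx, test_idx in kf.split(X):
    model = LGBMRegressor(n_estimators=200, learning_rate=0.05)  # assumed settings
    model.fit(X[train_idx], y[train_idx])
    pred = model.predict(X[test_idx])
    mse_scores.append(mean_squared_error(y[test_idx], pred))
    mae_scores.append(mean_absolute_error(y[test_idx], pred))

print(f"mean MSE: {np.mean(mse_scores):.4f}, mean MAE: {np.mean(mae_scores):.4f}")
```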

Source: http://dx.doi.org/10.3233/SHTI240531

Publication Analysis

Top Keywords (frequency)
spike latency: 24
predict spike: 8
latency response: 8
response external: 8
machine learning: 8
spike: 6
latency: 6
models: 5
comparison regression: 4
regression methods: 4

Similar Publications

Toward a Free-Response Paradigm of Decision-Making in Spiking Neural Networks.

Neural Comput

January 2025

Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, 200437, China

Spiking neural networks (SNNs) have attracted significant interest in the development of brain-inspired computing systems due to their energy efficiency and similarities to biological information processing. In contrast to continuous-valued artificial neural networks, which produce results in a single step, SNNs require multiple inference steps to reach a desired accuracy level, which imposes a burden on real-time responsiveness and energy efficiency. Inspired by the speed-accuracy tradeoff in human and animal decision-making, which exhibits correlations among reaction time, task complexity, and decision confidence, the question arises of how an SNN model can benefit from implementing these attributes.
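This excerpt only poses the question; as a rough illustration of the free-response idea it alludes to, the hedged sketch below accumulates output spikes over time-steps and stops as soon as the leading class reaches a confidence threshold. The `step_fn` interface and the threshold are assumptions made for illustration, not the publication's own method.

```python
# Illustrative (not from the paper): confidence-triggered early exit during SNN inference.
# Output spikes are accumulated over time-steps, and inference stops as soon as the
# normalized evidence for the leading class exceeds a confidence threshold.
import numpy as np

def free_response_inference(step_fn, x, max_steps=50, confidence=0.9):
    """step_fn(x, t) -> spike counts per class at time-step t (assumed interface)."""
    evidence = None
    for t in range(max_steps):
        out = step_fn(x, t)
        evidence = out if evidence is None else evidence + out
        probs = evidence / max(evidence.sum(), 1e-9)
        if probs.max() >= confidence:          # decision confidence reached early
            return int(probs.argmax()), t + 1  # class and reaction time (steps used)
    return int(evidence.argmax()), max_steps   # fall back to the full time budget
```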

Robotic systems rely on spatio-temporal information to solve control tasks. With advancements in deep neural networks, reinforcement learning has significantly enhanced the performance of control tasks by leveraging deep learning techniques. However, as deep neural networks grow in complexity, they consume more energy and introduce greater latency.

Background: Intramuscular mRNA vaccines against severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) elicit antibody responses of low intensity and latency in patients with muscular disorders (MDs). However, the mechanisms involved in this phenomenon remain unknown. This study aimed to clarify the mechanism of the low immunogenicity of intramuscular SARS-CoV-2 mRNA vaccination in patients with MDs.

Article Synopsis
  • Researchers aim to create artificial neural networks that match the performance of biological networks, focusing on accuracy, efficiency, and low latency.
  • They developed a new spiking neural network (SNN) using concepts from neuroscience, including self-inhibiting autapse and neuron diversity, to improve learning and memory capabilities.
  • The new SNN model demonstrated superior performance, achieving higher accuracy, energy efficiency, and reduced latency in various AI tasks, and successfully identified rare cell types linked to severe brain diseases.

Spiking Neural Networks (SNNs) are at the forefront of computational neuroscience, emulating the nuanced dynamics of biological systems. In the realm of SNN training methods, the conversion from artificial neural networks (ANNs) to SNNs has generated significant interest due to its potential for creating energy-efficient and biologically plausible models. However, existing conversion methods often require long time-steps to ensure that the converted SNNs achieve performance comparable to the original ANNs.
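For context, the sketch below illustrates the classic rate-based flavor of ANN-to-SNN conversion that this excerpt alludes to: each ReLU unit is approximated by an integrate-and-fire neuron whose firing rate over T time-steps tracks the original activation, which is why good accuracy typically demands long time-steps. This is a generic illustration under stated assumptions, not the publication's own method.

```python
# Illustrative sketch (not the paper's method): one layer of a rate-based ANN-to-SNN
# conversion. The integrate-and-fire neuron's firing rate over T steps approximates
# the ReLU activation of the corresponding ANN layer; larger T gives a closer match.
import numpy as np

def if_layer(inp_rates, W, b, T=100, v_th=1.0):
    """Simulate one integrate-and-fire layer for T steps; return output firing rates."""
    v = np.zeros(b.shape)
    spikes = np.zeros(b.shape)
    for _ in range(T):
        v += inp_rates @ W + b        # integrate the (rate-coded) input current
        fired = v >= v_th
        spikes += fired
        v[fired] -= v_th              # soft reset preserves residual charge
    return spikes / T                 # firing rate ~ ReLU activation of the ANN layer
```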
