Parallel Synapses with Transmission Nonlinearities Enhance Neuronal Classification Capacity.

bioRxiv

Department of Neurobiology, School of Biological Sciences, University of California San Diego, La Jolla, CA 92093, USA.

Published: July 2024

Cortical neurons often establish multiple synaptic contacts with the same postsynaptic neuron. To avoid functional redundancy of these parallel synapses, it is crucial that each synapse exhibits distinct computational properties. Here we model the current to the soma contributed by each synapse as a sigmoidal transmission function of its presynaptic input, with learnable parameters such as amplitude, slope, and threshold. We evaluate the classification capacity of a neuron equipped with such nonlinear parallel synapses, and show that with a small number of parallel synapses per axon, it substantially exceeds that of the Perceptron. Furthermore, the number of correctly classified data points can increase superlinearly as the number of presynaptic axons grows. When training with an unrestricted number of parallel synapses, our model neuron can effectively implement an arbitrary aggregate transmission function for each axon, constrained only by monotonicity. Nevertheless, successful learning in the model neuron often requires only a small number of parallel synapses. We also apply these parallel synapses in a feedforward neural network trained to classify MNIST images, and show that they can increase the test accuracy. This demonstrates that multiple nonlinear synapses per input axon can substantially enhance a neuron's computational power.
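To make the model described above concrete, the following is a minimal sketch (not the authors' code) of a neuron whose input from each presynaptic axon arrives through several parallel synapses, each with its own sigmoidal transmission function. The parameter names (amplitude, slope, threshold) follow the abstract; all other names and choices are illustrative assumptions.

```python
# A minimal sketch, assuming each synapse transmits amplitude * sigmoid(slope * (x - threshold))
# and the neuron classifies by the sign of the summed somatic current.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ParallelSynapseNeuron:
    def __init__(self, n_axons, n_synapses_per_axon, seed=None):
        rng = np.random.default_rng(seed)
        shape = (n_axons, n_synapses_per_axon)
        # Learnable per-synapse parameters named in the abstract.
        self.amplitude = rng.normal(0.0, 1.0, shape)
        self.slope = rng.uniform(0.5, 2.0, shape)
        self.threshold = rng.normal(0.0, 1.0, shape)
        self.bias = 0.0  # somatic offset (an assumption of this sketch)

    def somatic_current(self, x):
        # x: (n_axons,) presynaptic activities. Every synapse applies its own
        # sigmoidal transmission to its axon's activity; currents are summed.
        z = self.slope * (x[:, None] - self.threshold)
        return np.sum(self.amplitude * sigmoid(z)) + self.bias

    def classify(self, x):
        # Binary label from the sign of the total current to the soma.
        return 1 if self.somatic_current(x) > 0 else 0

# Usage: 100 axons, 3 parallel synapses per axon, label one random pattern.
neuron = ParallelSynapseNeuron(n_axons=100, n_synapses_per_axon=3, seed=0)
label = neuron.classify(np.random.default_rng(1).normal(size=100))
```

In the paper's setting these per-synapse parameters would be trained on the classification task; with several synapses per axon, the summed sigmoids can approximate an arbitrary monotonic aggregate transmission function for that axon.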

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11244940
DOI: http://dx.doi.org/10.1101/2024.07.01.601490

Publication Analysis

Top Keywords
parallel synapses: 28
number parallel: 12
classification capacity: 8
transmission function: 8
small number: 8
model neuron: 8
parallel: 7
synapses: 7
number: 5
synapses transmission: 4
