Meta-learning synaptic plasticity and memory addressing for continual familiarity detection.

Neuron

Zuckerman Mind Brain Behavior Institute, Department of Neuroscience, Columbia University, New York, NY 10027, USA; Department of Physiology and Cellular Biophysics, Columbia University, New York, NY 10027, USA.

Published: February 2022

Over the course of a lifetime, we process a continual stream of information. Extracted from this stream, memories must be efficiently encoded and stored in an addressable manner for retrieval. To explore potential mechanisms, we consider a familiarity detection task in which a subject reports whether an image has been previously encountered. We design a feedforward network endowed with synaptic plasticity and an addressing matrix, meta-learned to optimize familiarity detection over long intervals. We find that anti-Hebbian plasticity leads to better performance than Hebbian plasticity and replicates experimental results such as repetition suppression. A combinatorial addressing function emerges, selecting a unique neuron as an index into the synaptic memory matrix for storage or retrieval. Unlike previous models, this network operates continuously and generalizes to intervals it has not been trained on. Our work suggests a biologically plausible mechanism for continual learning and demonstrates an effective application of machine learning for neuroscience discovery.
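The abstract itself contains no code; as a rough illustration of the anti-Hebbian storage idea it describes, the following is a minimal numpy sketch, not the authors' meta-learned network. The fixed random feedforward weights, the outer-product plasticity rule, and the constants `eta` and `decay` are illustrative assumptions, and the meta-learned addressing matrix is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden = 100, 50          # input (image) and memory-layer sizes (illustrative)
W_fixed = rng.standard_normal((n_hidden, n_in)) / np.sqrt(n_in)  # fixed feedforward weights
M = np.zeros((n_hidden, n_in))    # plastic synaptic memory matrix
eta, decay = 0.5, 0.99            # learning rate and forgetting factor (assumed values)

def familiarity(x):
    """Readout: overlap between the plastic weights' drive and the hidden response."""
    h = np.tanh(W_fixed @ x)
    return h @ (M @ x)            # strongly negative after anti-Hebbian storage -> 'familiar'

def store(x):
    """Anti-Hebbian update: weaken (rather than strengthen) co-active synapses."""
    global M
    h = np.tanh(W_fixed @ x)
    M = decay * M - eta * np.outer(h, x)   # minus sign makes the rule anti-Hebbian

# Toy usage: store a few images, then compare readouts for a stored vs. a novel image.
images = [rng.standard_normal(n_in) for _ in range(5)]
for x in images:
    store(x)

novel = rng.standard_normal(n_in)
print("familiar image score:", familiarity(images[0]))
print("novel image score:   ", familiarity(novel))
```

Under this assumed rule, previously stored images drive the readout strongly negative while novel images score near zero, so a simple threshold separates familiar from novel; the suppressed response to repeated stimuli is loosely analogous to the repetition suppression the abstract refers to.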


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8813911
DOI: http://dx.doi.org/10.1016/j.neuron.2021.11.009

Publication Analysis

Top Keywords: familiarity detection (12); synaptic plasticity (8); meta-learning synaptic (4); plasticity (4); plasticity memory (4); memory addressing (4); addressing continual (4); continual familiarity (4); detection course (4); course lifetime (4)
