From Pavlov Conditioning to Hebb Learning.

Neural Computation

Sapienza University of Rome, Department of Mathematics, 00185, Rome, Italy.

Published: April 2023

Hebbian learning traces its origin to Pavlov's classical conditioning; however, while the former has been extensively modeled over the past decades (e.g., by the Hopfield model and countless variations on the theme), the latter has remained largely unmodeled so far, and a mathematical bridge connecting these two pillars is still lacking. The main difficulty lies in the intrinsically different scales of the information involved: Pavlov's theory concerns correlations between concepts that are (dynamically) stored in the synaptic matrix, as exemplified by the celebrated experiment starring a dog and a ringing bell; Hebb's theory, conversely, concerns correlations between pairs of neurons, as summarized by the famous statement that "neurons that fire together wire together." In this letter, we rely on stochastic-process theory to prove that, as long as the timescales of neurons and synapses are kept widely separated, Pavlov's mechanism spontaneously takes place and ultimately gives rise to synaptic weights that recover the Hebbian kernel.
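To make the timescale-separation argument concrete, here is a minimal numerical sketch (not the authors' actual construction): fast Glauber dynamics let the neurons equilibrate under a presented stimulus while the couplings J are frozen, and a slow drift with rate eps << 1 then nudges J toward the observed neural correlations. All sizes, rates, and helper names (N, P, eps, beta, relax) are illustrative assumptions; the target is the Hebbian kernel J_ij = (1/P) * sum_mu xi_i^mu * xi_j^mu familiar from Hopfield-type models.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 50, 3                           # neurons and stored patterns (hypothetical sizes)
xi = rng.choice([-1, 1], size=(P, N))  # binary stimuli; each xi[mu] plays a "bell"/"food" pairing

J = np.zeros((N, N))                   # synapses start from a tabula rasa
eps = 1e-3                             # slow synaptic rate: eps << 1 enforces the timescale split
beta = 2.0                             # inverse temperature of the fast neural dynamics

def relax(sigma, h, steps=200):
    """Fast Glauber dynamics: neurons equilibrate while synapses stay frozen."""
    for _ in range(steps):
        i = rng.integers(N)
        field = J[i] @ sigma + h[i]
        # P(sigma_i = +1) = (1 + tanh(beta * field)) / 2, the standard Glauber rule
        sigma[i] = 1.0 if rng.random() < 0.5 * (1.0 + np.tanh(beta * field)) else -1.0
    return sigma

for _ in range(3000):
    mu = rng.integers(P)                                # present one stimulus
    sigma = relax(rng.choice([-1, 1], size=N).astype(float), xi[mu].astype(float))
    J += eps * (np.outer(sigma, sigma) - J)             # slow Hebbian/Pavlovian drift
    np.fill_diagonal(J, 0.0)

J_hebb = (xi.T @ xi) / P                                # Hebbian kernel (1/P) sum_mu xi_i xi_j
np.fill_diagonal(J_hebb, 0.0)
print("correlation with Hebbian kernel:",
      np.corrcoef(J.ravel(), J_hebb.ravel())[0, 1])
```

Because the slow update is an exponential moving average of the equilibrated neural correlations, the printed correlation should approach 1 as the number of stimulus presentations grows, mirroring the letter's claim that the Hebbian kernel emerges spontaneously from the Pavlovian dynamics.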


DOI: http://dx.doi.org/10.1162/neco_a_01578

Publication Analysis

Top Keywords (bigram frequency)

theory correlations: 8
pavlov conditioning: 4
conditioning hebb: 4
hebb learning: 4
learning hebb's: 4
hebb's learning: 4
learning traces: 4
traces origin: 4
origin pavlov's: 4
pavlov's classical: 4

