Hebbian learning traces its origins to Pavlov's classical conditioning; however, while the former has been extensively modeled over the past decades (e.g., by the Hopfield model and countless variations on the theme), the modeling of the latter has remained largely unaddressed so far.
Proc Natl Acad Sci U S A, March 2023
A crucial challenge in medicine is choosing which drug (or combination of drugs) will be the most beneficial for a particular patient. Drug response rates usually differ substantially, and the reasons for this unpredictability remain unclear. It is therefore essential to identify the features that contribute to the observed variability in drug response.
IEEE Trans Neural Netw Learn Syst, June 2022
Inspired by a formal equivalence between the Hopfield model and restricted Boltzmann machines (RBMs), we design a Boltzmann machine, referred to as the dreaming Boltzmann machine (DBM), which achieves better performance than the standard one. The novelty of our model lies in a precise prescription for intralayer connections among hidden neurons, whose strengths depend on feature correlations. We analyze the learning and retrieval capabilities of DBMs, both theoretically and numerically, and compare them to the RBM reference.
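The following is a minimal numerical sketch of this kind of architecture, assuming spin (±1) units and a correlation-based prescription for the hidden-hidden couplings; the names, sizes, and exact form of the coupling matrix are illustrative assumptions, not the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(0)
N_v, N_h = 20, 8          # visible / hidden layer sizes (illustrative)

# Illustrative binary features used to build the intralayer couplings.
xi = rng.choice([-1.0, 1.0], size=(N_h, N_v))

W = rng.normal(0, 0.1, size=(N_v, N_h))   # visible-hidden weights
a = np.zeros(N_v)                          # visible biases
b = np.zeros(N_h)                          # hidden biases

# Assumed prescription: hidden-hidden couplings proportional to feature
# correlations, with the diagonal removed (the paper's exact formula
# may differ).
L = (xi @ xi.T) / N_v
np.fill_diagonal(L, 0.0)

def energy(v, h):
    """Boltzmann-machine energy with an extra intralayer hidden term."""
    return -(v @ W @ h) - a @ v - b @ h - 0.5 * h @ L @ h

def gibbs_hidden(v, h, beta=1.0):
    """One sequential Gibbs sweep over hidden units; sequential updates
    are needed because the hidden units are now coupled to each other."""
    for j in range(N_h):
        # Local field on hidden spin j (diag(L) is zero: no self-coupling).
        field = v @ W[:, j] + b[j] + L[j] @ h
        p = 1.0 / (1.0 + np.exp(-2.0 * beta * field))
        h[j] = 1.0 if rng.random() < p else -1.0
    return h

v = rng.choice([-1.0, 1.0], size=N_v)
h = rng.choice([-1.0, 1.0], size=N_h)
print("energy before sweep:", energy(v, h))
h = gibbs_hidden(v, h)
print("energy after sweep: ", energy(v, h))
```

Without the intralayer term L the hidden units are conditionally independent given the visibles and could be sampled in parallel; that is exactly the RBM property the intralayer prescription trades away.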
In this work we apply statistical-mechanics tools to infer cardiac pathologies over a sample of M patients whose heart-rate variability has been recorded via a 24 h Holter device and who are divided into different classes according to their clinical status (providing a repository of labelled data). Considering the set of inter-beat interval sequences $\{\boldsymbol{r}^{(\mu)}\}$, with $\mu = 1, \dots, M$, we estimate their probability distribution $P(\boldsymbol{r})$ by exploiting the maximum-entropy principle. By setting constraints on the first and second moments we obtain an effective pairwise $(\boldsymbol{h}, \boldsymbol{J})$ model, whose parameters are shown to depend on the clinical status of the patient.
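As background for the modeling step above, the maximum-entropy distribution consistent with fixed first and second moments is the pairwise exponential family; a brief sketch in assumed notation (the paper's conventions may differ):

```latex
% Maximizing entropy subject to fixed <r_i> and <r_i r_j> yields a
% pairwise (h, J) model:
P(\boldsymbol{r}) = \frac{1}{Z}
  \exp\!\Big( \sum_i h_i r_i + \frac{1}{2} \sum_{i \neq j} J_{ij}\, r_i r_j \Big),
% where h_i and J_{ij} are the Lagrange multipliers enforcing the
% first- and second-moment constraints, and Z normalizes P.
```

For continuous-valued sequences this is a multivariate Gaussian, so (up to sign conventions) J is fixed by the inverse empirical covariance and h by the empirical means.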
In this paper we develop statistical algorithms to infer possible cardiac pathologies, based on data collected from 24 h Holter recordings over a sample of 2829 labelled patients; labels indicate whether a patient suffers from cardiac pathologies. In the first part of the work we statistically analyze the heart-beat series associated with each patient and process them to obtain a coarse-grained description of heart variability in terms of 49 markers well established in the reference community. These markers are then used as inputs to a multi-layer feed-forward neural network that we train to classify patients.
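A minimal sketch of this classification stage, with synthetic stand-in data; the network shape, training settings, and labels are illustrative assumptions, and only the 49-marker input dimension and sample size come from the text:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Stand-in for the real dataset: 2829 patients x 49 HRV markers,
# with synthetic binary labels (pathological vs. healthy).
X = rng.normal(size=(2829, 49))
y = (X[:, :5].sum(axis=1) + 0.5 * rng.normal(size=2829) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

# Markers live on very different scales, so standardize before training.
scaler = StandardScaler().fit(X_train)

# A small multi-layer feed-forward network; hidden sizes are a guess.
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500,
                    random_state=0)
clf.fit(scaler.transform(X_train), y_train)
print("test accuracy:", clf.score(scaler.transform(X_test), y_test))
```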
In this work we develop analytical techniques to investigate a broad class of associative neural networks in the high-storage regime. These techniques translate the original statistical-mechanics problem into an analytical-mechanics one, which amounts to solving a set of partial differential equations rather than tackling the canonical probabilistic route. We test the method on the classical Hopfield model, where the cost function includes only two-body interactions (i.e., pairwise couplings).
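To illustrate the flavor of this translation, here is the mechanical analogy in its simplest setting, the Curie-Weiss model; the conventions below are assumptions of this sketch, and the Hopfield case treated in the paper is more involved:

```latex
% Interpolating action with fictitious time t and space x coupled to
% the magnetization m = N^{-1} \sum_i \sigma_i:
S_N(t, x) = \frac{1}{N}
  \ln \sum_{\boldsymbol{\sigma}} \exp\!\Big( \frac{tN}{2}\, m^2 + x N m \Big).
% Direct differentiation gives
\partial_t S_N - \frac{1}{2} \big( \partial_x S_N \big)^2
  = \frac{1}{2} \Big( \langle m^2 \rangle - \langle m \rangle^2 \Big),
% and the variance on the right vanishes as N -> infinity, so the
% limiting action solves a free Hamilton-Jacobi equation (a PDE problem
% solvable by characteristics), with the physical free energy recovered
% at t = beta, x = 0.
```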
We consider a three-layer Sejnowski machine and show that features learnt via contrastive divergence have a dual representation as patterns in a dense associative memory of order P=4. The latter is known to be able to store, via Hebbian learning, a number of patterns scaling as N^{P-1}, where N denotes the number of constituent binary neurons interacting P-wise. We also prove that, by keeping the dense associative network far from the saturation regime (namely, allowing for a number of patterns scaling only linearly with N, while P>2), such a system is able to perform pattern recognition far below the standard signal-to-noise threshold.
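A minimal numerical sketch of a dense (P = 4) associative memory and its retrieval dynamics; the sizes and the greedy zero-temperature dynamics are illustrative choices, not the paper's analysis:

```python
import numpy as np

rng = np.random.default_rng(1)
N, K, P = 100, 30, 4   # neurons, stored patterns, interaction order

xi = rng.choice([-1, 1], size=(K, N))   # stored binary patterns

def energy(sigma):
    """Dense associative memory: E = -sum_mu (xi_mu . sigma)^P / N^(P-1)."""
    return -np.sum((xi @ sigma) ** P) / N ** (P - 1)

def retrieve(sigma, sweeps=5):
    """Asynchronous zero-temperature dynamics: flip a spin whenever
    doing so lowers the energy."""
    sigma = sigma.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            flipped = sigma.copy()
            flipped[i] = -flipped[i]
            if energy(flipped) < energy(sigma):
                sigma = flipped
    return sigma

# Corrupt the first pattern and check retrieval.
noisy = xi[0].copy()
flip = rng.choice(N, size=15, replace=False)
noisy[flip] = -noisy[flip]
out = retrieve(noisy)
print("overlap with stored pattern:", (out @ xi[0]) / N)
```

With P = 2 this reduces to the standard Hopfield energy; raising P sharpens the energy landscape around each stored pattern, which is what pushes the Hebbian capacity from linear in N up to N^{P-1}.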
The standard Hopfield model for associative neural networks accounts for biological Hebbian learning and acts as the harmonic oscillator of pattern recognition; however, its maximal storage capacity is α∼0.14, far from the theoretical bound for symmetric networks, i.e. α=1.
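For reference, the standard definitions behind these capacity figures (textbook conventions, not specific to this paper):

```latex
% Hopfield Hamiltonian with Hebbian couplings built from K binary
% patterns xi^mu:
H(\boldsymbol{\sigma}) = -\frac{1}{2} \sum_{i \neq j} J_{ij}\, \sigma_i \sigma_j,
\qquad
J_{ij} = \frac{1}{N} \sum_{\mu = 1}^{K} \xi_i^{\mu} \xi_j^{\mu},
% with load alpha = K/N: Hebbian retrieval breaks down at
% alpha_c ~ 0.14, while symmetric networks allow up to alpha = 1.
```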
We propose a modification of the cost function of the Hopfield model whose salient features emerge in its Taylor expansion and result in higher-than-pairwise interactions with alternating signs, suggesting a unified framework for handling both deep learning and network pruning. In our analysis we rely heavily on the Hamilton-Jacobi correspondence relating the statistical model to a mechanical system. In this picture, our model is nothing but the relativistic extension of the original Hopfield model (whose cost function is a quadratic form in the Mattis magnetization and plays the role of the non-relativistic counterpart, namely the classical limit).
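A sketch of the expansion referred to above, under an assumed normalization (the prefactors are illustrative; the paper fixes them precisely):

```latex
% Relativistic cost function in the Mattis magnetization M; the Taylor
% expansion of the square root generates higher-order interactions with
% alternating signs:
H_{\mathrm{rel}} \propto -\sqrt{1 + M^2}
  = -\Big( 1 + \frac{M^2}{2} - \frac{M^4}{8} + \frac{M^6}{16} - \dots \Big),
% where the leading M^2 term reproduces the classical (standard
% Hopfield) limit.
```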