Publications by authors named "C Lucibello"

Recent generalizations of the Hopfield model of associative memory are able to store a number P of random patterns that grows exponentially with the number N of neurons, P = exp(αN). Besides this huge storage capacity, another interesting feature of these networks is their connection to the attention mechanism, a central component of the Transformer architecture widely used in deep learning. In this work, we study a generic family of pattern ensembles using a statistical mechanics analysis that yields the exact asymptotic threshold α_1 for the retrieval of a typical pattern, lower bounds on the maximum load α_c at which all patterns can be retrieved, and the sizes of the attraction basins.
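To make the attention connection concrete, here is a minimal NumPy sketch of the softmax-based retrieval dynamics commonly used for exponential-capacity Hopfield networks. The i.i.d. ±1 pattern ensemble, the inverse temperature beta, and the sign nonlinearity are illustrative assumptions, not the generic ensembles analyzed in the paper.

```python
import numpy as np

def retrieve(xi, x, beta=4.0, steps=10):
    """Softmax retrieval update x <- sign(xi^T softmax(beta * xi @ x / sqrt(N))),
    which has the same weighted-superposition structure as an attention head."""
    N = len(x)
    for _ in range(steps):
        overlaps = xi @ x / np.sqrt(N)             # similarity to each stored pattern
        w = np.exp(beta * (overlaps - overlaps.max()))
        w /= w.sum()                               # attention-like softmax weights
        x = np.sign(xi.T @ w)                      # weighted superposition of patterns
    return x

rng = np.random.default_rng(0)
N, P = 200, 1000                                   # P can exceed N in this regime
xi = rng.choice([-1.0, 1.0], size=(P, N))          # i.i.d. binary patterns
x0 = xi[0] * rng.choice([1.0, 1.0, 1.0, -1.0], size=N)  # corrupt ~25% of spins
print("overlap after retrieval:", retrieve(xi, x0) @ xi[0] / N)
```

For large enough beta the softmax concentrates on the most similar pattern, which is the mechanism behind the exponential scaling of the capacity.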

Processing faces accurately and efficiently is a key capability of humans and other animals that engage in sophisticated social tasks. Recent studies have reported a decoupled coding for faces in the primate inferotemporal cortex, with two separate neural populations coding, respectively, for the geometric positions of (texture-free) facial landmarks and for the image texture at fixed landmark positions. Here, we formally assess the efficiency of this decoupled coding by appealing to the information-theoretic notion of description length, which quantifies the amount of information saved when encoding novel facial images at a given precision.
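To illustrate the description-length notion, the following toy sketch uses a generic two-part code: a Gaussian model fitted to (hypothetical) landmark coordinates assigns a code length of roughly -log2 p(x) + d log2(1/ε) bits to a novel point encoded at precision ε per coordinate. This is a standard MDL illustration under assumed data, not the estimator used in the study.

```python
import numpy as np
from scipy.stats import multivariate_normal

def description_length_bits(train, test, eps=1e-2):
    """Code length (in bits) of each test point under a Gaussian model
    fitted on train, quantized at precision eps per coordinate:
        L(x) ~ -log2 p(x) + d * log2(1/eps)."""
    d = train.shape[1]
    mu = train.mean(axis=0)
    cov = np.cov(train, rowvar=False) + 1e-6 * np.eye(d)  # regularized fit
    logp = multivariate_normal(mu, cov).logpdf(test)      # natural-log density
    return -logp / np.log(2) + d * np.log2(1.0 / eps)

rng = np.random.default_rng(1)
coords = rng.normal(size=(500, 10))     # stand-in for landmark coordinates
print(description_length_bits(coords[:400], coords[400:]).mean(), "bits per image")
```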

The Hopfield model is a paradigmatic model of neural networks that has been analyzed for many decades in the statistical physics, neuroscience, and machine learning communities. Inspired by the manifold hypothesis in machine learning, we propose and investigate a generalization of the standard setting that we name the random-features Hopfield model. Here, P binary patterns of length N are generated by applying a random projection followed by a nonlinearity to Gaussian vectors sampled from a latent space of dimension D.
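The generative model described here translates almost line by line into code. The sketch below uses a Gaussian projection matrix and a sign nonlinearity as concrete choices; the specific sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N, D, P = 1000, 50, 200                    # illustrative sizes, with D << N

F = rng.normal(size=(N, D)) / np.sqrt(D)   # random projection: latent space -> neurons
z = rng.normal(size=(D, P))                # P Gaussian latent vectors of dimension D
xi = np.sign(F @ z)                        # binary patterns, shape (N, P)

# Unlike i.i.d. Hopfield patterns, these are correlated through the shared
# feature matrix F, as the nonzero typical overlap shows:
ov = xi.T @ xi / N
print("mean |overlap| between distinct patterns:",
      np.abs(ov[np.triu_indices(P, k=1)]).mean())
```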

Empirical studies of the landscape of neural networks have shown that low-energy configurations are often found in complex connected structures, where zero-energy paths between pairs of distant solutions can be constructed. Here, we consider the spherical negative perceptron, a prototypical nonconvex neural network model framed as a continuous constraint satisfaction problem. We introduce a general analytical method for computing energy barriers in the simplex whose vertices are configurations sampled from equilibrium.
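As a rough numerical counterpart of the quantity computed analytically here, the sketch below evaluates a standard spherical-perceptron energy (number of violated margin constraints) along the normalized straight path between two configurations, i.e. along one edge of the simplex. The constraint matrix, the margin κ, and the random endpoints are hypothetical placeholders; in the paper the vertices are equilibrium samples of actual solutions.

```python
import numpy as np

def energy(w, X, kappa):
    """Number of violated constraints X @ w / sqrt(N) >= kappa: a standard
    energy for the spherical perceptron (negative margin when kappa < 0)."""
    return int(np.sum(X @ w / np.sqrt(len(w)) < kappa))

def barrier_profile(w1, w2, X, kappa, n_points=21):
    """Energy along the normalized linear interpolation between w1 and w2
    (one edge of the simplex construction), kept on the sphere |w|^2 = N."""
    N = len(w1)
    profile = []
    for g in np.linspace(0.0, 1.0, n_points):
        w = (1 - g) * w1 + g * w2
        w *= np.sqrt(N) / np.linalg.norm(w)    # project back onto the sphere
        profile.append(energy(w, X, kappa))
    return np.array(profile)

rng = np.random.default_rng(0)
N, M, kappa = 200, 100, -0.5
X = rng.normal(size=(M, N))                    # random constraint vectors
w1, w2 = rng.normal(size=N), rng.normal(size=N)  # placeholder endpoints
w1 *= np.sqrt(N) / np.linalg.norm(w1)
w2 *= np.sqrt(N) / np.linalg.norm(w2)
print(barrier_profile(w1, w2, X, kappa))
```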

We present a comparison between various algorithms for inferring covariance and precision matrices from small data sets of real vectors, of the typical length and dimension of human brain activity time series obtained by functional magnetic resonance imaging (fMRI). Assuming a Gaussian model underlying the neural activity, the problem consists in denoising the empirically observed matrices to obtain better estimators of the (unknown) true precision and covariance matrices. We consider several standard noise-cleaning algorithms and compare them on two types of data sets.
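As one example of a standard noise-cleaning baseline in such a comparison, the sketch below contrasts the raw empirical estimator with Ledoit-Wolf linear shrinkage on synthetic Gaussian time series of a comparable "few samples, many dimensions" shape, scoring both by held-out log-likelihood. The synthetic generator and the sizes are illustrative assumptions, not the paper's data sets or full set of algorithms.

```python
import numpy as np
from sklearn.covariance import EmpiricalCovariance, LedoitWolf

rng = np.random.default_rng(0)
d, T = 100, 150                                 # dimensions ~ number of samples
A = rng.normal(size=(d, d)) / np.sqrt(d)
true_cov = A @ A.T + np.eye(d)                  # a random "true" covariance
L = np.linalg.cholesky(true_cov)
X_train = (L @ rng.normal(size=(d, T))).T       # observed (T, d) time series
X_test = (L @ rng.normal(size=(d, 5 * T))).T    # held-out data for scoring

# Shrinkage typically beats the raw sample covariance in this regime:
for est in (EmpiricalCovariance(), LedoitWolf()):
    est.fit(X_train)
    print(type(est).__name__, "held-out log-likelihood:", est.score(X_test))
```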
