Publications by authors named "Maxence Ernoult"

The brain naturally binds events from different sources into unified concepts. It is hypothesized that this process occurs through the transient mutual synchronization of neurons located in different regions of the brain when a stimulus is presented. This mechanism of 'binding through synchronization' can be directly implemented in neural networks composed of coupled oscillators.
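
As a rough illustration of the coupled-oscillator setting (a hedged sketch, not the specific network studied here), the Kuramoto model below shows how oscillators with a shared coupling term lock their phases; the oscillator count, coupling strength, and natural frequencies are arbitrary example values.

```python
# Minimal Kuramoto sketch: dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i).
# With strong enough coupling K the phases lock, i.e. the oscillators "bind".
import numpy as np

rng = np.random.default_rng(0)
N, K, dt, steps = 32, 2.0, 0.01, 5000            # arbitrary example values
omega = rng.normal(0.0, 0.5, N)                  # natural frequencies
theta = rng.uniform(0.0, 2.0 * np.pi, N)         # initial phases

for _ in range(steps):
    coupling = np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
    theta += dt * (omega + (K / N) * coupling)

# Order parameter r in [0, 1]: r close to 1 means the population has synchronized.
r = np.abs(np.exp(1j * theta).mean())
print(f"synchronization order parameter r = {r:.2f}")
```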

While deep neural networks have surpassed human performance in multiple situations, they are prone to catastrophic forgetting: upon training a new task, they rapidly forget previously learned ones. Neuroscience studies, based on idealized tasks, suggest that in the brain, synapses overcome this issue by adjusting their plasticity depending on their past history. However, such "metaplastic" behaviors do not transfer directly to mitigate catastrophic forgetting in deep neural networks.
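
As a hedged illustration of the metaplasticity idea (not necessarily the rule used in the paper), the sketch below gives each synapse a hidden variable whose accumulated magnitude attenuates updates that would undo it, so synapses that have been consistently reinforced become harder to overwrite; the attenuation function and the strength `m` are illustrative choices.

```python
# Metaplasticity-style consolidation sketch: updates that push a hidden
# weight back toward zero are damped in proportion to how far it has grown.
import numpy as np

def metaplastic_update(h, grad, lr=0.01, m=1.0):
    """Return updated hidden weights h after one metaplastic gradient step.

    Steps that push h back toward zero (i.e. would erase what the synapse
    has consolidated) are attenuated by exp(-m * |h|); steps that reinforce
    the current sign are applied in full.
    """
    step = -lr * grad
    opposing = np.sign(step) != np.sign(h)              # step fights the stored sign
    attenuation = np.where(opposing, np.exp(-m * np.abs(h)), 1.0)
    return h + attenuation * step

# Toy usage: after many consistent updates, an opposing gradient moves h
# much less than an unattenuated step would.
h = np.zeros(3)
for _ in range(200):
    h = metaplastic_update(h, grad=np.array([-1.0, -1.0, -1.0]))   # grow h > 0
print(h, metaplastic_update(h, grad=np.array([1.0, 1.0, 1.0])) - h)
```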

Finding spike-based learning algorithms that can be implemented within the local constraints of neuromorphic systems, while achieving high accuracy, remains a formidable challenge. Equilibrium propagation is a promising alternative to backpropagation as it only involves local computations, but hardware-oriented studies have so far focused on rate-based networks. In this work, we develop a spiking neural network algorithm called EqSpike, compatible with neuromorphic systems, which learns by equilibrium propagation.
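
As a hedged sketch of the general idea, assuming firing rates are estimated from spike counts in the free and nudged phases, the update below is contrastive and purely local: each synapse only needs the rates of the two neurons it connects. This is an illustration of the principle, not the exact EqSpike estimator.

```python
# Spiking flavour of an equilibrium-propagation-style update: estimate rates
# from spike trains in both phases, then apply a local contrastive rule.
import numpy as np

def rate_from_spikes(spike_train, dt):
    """Estimate firing rates (Hz) from a binary (T, n) spike train."""
    return spike_train.sum(axis=0) / (spike_train.shape[0] * dt)

def local_ep_update(W, rates_free, rates_nudged, beta, lr=1e-6):
    """Local update: only pre/post rate products from the two phases."""
    corr_free = np.outer(rates_free, rates_free)
    corr_nudged = np.outer(rates_nudged, rates_nudged)
    return W + (lr / beta) * (corr_nudged - corr_free)

dt, T, n = 1e-3, 200, 5                          # arbitrary example sizes
rng = np.random.default_rng(1)
spikes_free = rng.random((T, n)) < 0.05          # fake Poisson-like spikes
spikes_nudged = rng.random((T, n)) < 0.06
W = np.zeros((n, n))
W = local_ep_update(W, rate_from_spikes(spikes_free, dt),
                    rate_from_spikes(spikes_nudged, dt), beta=0.1)
```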

Equilibrium Propagation is a biologically inspired algorithm that trains convergent recurrent neural networks with a local learning rule. This approach is a major step toward learning-capable neuromorphic systems and comes with strong theoretical guarantees. Equilibrium propagation operates in two phases, during which the network is first allowed to evolve freely and then "nudged" toward a target; the weights of the network are then updated based solely on the states of the neurons that they connect.
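
For reference, in the standard rate-based formulation of equilibrium propagation (Scellier and Bengio, 2017), the two phases translate into the following local weight update, where s^0 is the free equilibrium, s^β the weakly nudged one, ρ the activation function, and β the nudging strength:

```latex
% Equilibrium-propagation weight update (rate-based formulation)
\Delta w_{ij} \;\propto\; \frac{1}{\beta}
  \Bigl( \rho\bigl(s_i^{\beta}\bigr)\,\rho\bigl(s_j^{\beta}\bigr)
       - \rho\bigl(s_i^{0}\bigr)\,\rho\bigl(s_j^{0}\bigr) \Bigr)
```

In the limit of small β, this contrastive estimate approaches the gradient of the supervised loss, which is the theoretical guarantee referred to above; the rule only uses quantities available at the two endpoints of each synapse.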

One of the major challenges in nanoelectronics today is to meet the needs of Artificial Intelligence by designing hardware neural networks which, by fusing computation and memory, process and learn from data with limited energy. For this purpose, memristive devices are excellent candidates to emulate synapses. A challenge, however, is to map existing learning algorithms onto a chip: for a physical implementation, a learning rule should ideally be local and tolerant to the typical intrinsic imperfections of memristive devices.
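
As a hedged sketch of what "local and imperfection-tolerant" can mean in practice (not the specific scheme of the paper), the example below applies a purely local weight change to an array of memristive synapses while modeling quantized conductance steps, device-to-device variability, and bounded conductance; all device parameters are illustrative, not measured values.

```python
# Local update applied to imperfect memristive synapses: the ideal change for
# synapse (i, j) depends only on pre_i and post_j, and the programmed change
# is quantized into noisy, device-dependent conductance pulses.
import numpy as np

rng = np.random.default_rng(2)

def program_conductance(G, ideal_dw, g_step=1e-6, variability=0.2,
                        g_min=0.0, g_max=1e-4):
    """Apply an ideal update as a number of discrete, noisy conductance pulses."""
    n_pulses = np.round(ideal_dw / g_step)                    # quantization
    gain = 1.0 + variability * rng.standard_normal(G.shape)   # device mismatch
    return np.clip(G + n_pulses * g_step * gain, g_min, g_max)

pre, post = rng.random(8), rng.random(4)
ideal_dw = 1e-5 * np.outer(post, pre)        # local: pre/post activities only
G = np.full((4, 8), 5e-5)                    # initial conductances
G = program_conductance(G, ideal_dw)
```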

In recent years, artificial neural networks have become the flagship algorithm of artificial intelligence. In these systems, neuron activation functions are static, and computing is achieved through standard arithmetic operations. By contrast, a prominent branch of neuroinspired computing embraces the dynamical nature of the brain and proposes to endow each component of a neural network with dynamical functionality, such as oscillations, and to rely on emergent physical phenomena, such as synchronization, for solving complex problems with small networks.

A recent theoretical breakthrough has brought a new tool, called the localization landscape, for predicting the localization regions of vibration modes in complex or disordered systems. Here, we report on the first experiment that measures the localization landscape and demonstrates its predictive power. Holographic measurement of the static deformation under uniform load of a thin plate with complex geometry provides direct access to the landscape function.
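
As a hedged one-dimensional toy analogue (not the plate experiment itself), the sketch below computes the localization landscape u by solving L u = 1 for a discretized operator L = -d²/dx² + V(x) with a disordered potential, then compares the landscape's main peak with the lowest eigenmode; grid size and potential amplitude are arbitrary example choices.

```python
# Toy 1-D localization landscape: solve L u = 1 and compare u's main peak
# with the lowest eigenmode of L. The plate experiment measures the 2-D
# analogue, i.e. the static deformation under uniform load.
import numpy as np

n, h = 400, 1.0 / 401
rng = np.random.default_rng(3)
V = 1e5 * rng.random(n)                          # disordered potential

# Dirichlet finite-difference discretization of L = -d^2/dx^2 + V.
L = (np.diag(2.0 / h**2 + V)
     - np.diag(np.full(n - 1, 1.0 / h**2), 1)
     - np.diag(np.full(n - 1, 1.0 / h**2), -1))

u = np.linalg.solve(L, np.ones(n))               # landscape: L u = 1

# Lowest eigenmode, for comparison with the landscape's main peak.
evals, evecs = np.linalg.eigh(L)
mode = np.abs(evecs[:, 0])
print("landscape peak at index", u.argmax(), "| mode peak at", mode.argmax())
```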
