Machine learning software is ubiquitous in many fields of science and society owing to its outstanding ability to solve computationally demanding problems, such as recognizing patterns and regularities in large data sets. Despite these impressive achievements, such software still runs on processors built around the von Neumann architecture, which is a bottleneck for faster and more power-efficient neuromorphic computation. A central goal of current research is therefore to realize physical artificial neural networks capable of performing fully parallel and ultrafast operations. Here we show that lattices of exciton-polariton condensates accomplish neuromorphic computing with outstanding accuracy thanks to their high optical nonlinearity. We demonstrate that our neural network significantly increases the recognition efficiency compared with linear classification algorithms on one of the most widely used benchmarks, the MNIST handwritten-digit problem, showing a concrete advantage of integrating optical systems into neural network architectures.
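For context, the sketch below shows the kind of linear baseline a nonlinear network is typically benchmarked against on MNIST. It uses scikit-learn logistic regression on raw pixels; this choice of library and model is an illustrative assumption, not the authors' exact classifier or training pipeline.

```python
# Minimal linear-classifier baseline on MNIST: logistic regression on raw pixels.
# Illustrative reference point only, not the experimental pipeline of the paper.
from sklearn.datasets import fetch_openml
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Downloads MNIST (~15 MB) from OpenML on first use.
X, y = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
X = X / 255.0  # scale pixel intensities to [0, 1]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=10_000, random_state=0)

clf = LogisticRegression(max_iter=200)  # multinomial logistic regression (lbfgs)
clf.fit(X_train, y_train)
print(f"Linear baseline accuracy: {clf.score(X_test, y_test):.3f}")
```

A plain linear model of this kind typically reaches on the order of 92% test accuracy on MNIST, which is the sort of reference point a nonlinear optical network needs to exceed.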

Source: http://dx.doi.org/10.1021/acs.nanolett.0c00435

Publication Analysis

Top Keywords

neuromorphic computing (8), neural network (8), polaritonic neuromorphic (4), computing outperforms (4), outperforms linear (4), linear classifiers (4), classifiers machine (4), machine learning (4), learning software (4), software applications (4)

Similar Publications

Mimicking Axon Growth and Pruning by Photocatalytic Growth and Chemical Dissolution of Gold on Titanium Dioxide Patterns.

Molecules

December 2024

Chair for Integrated Systems and Photonics, Department of Electrical and Information Engineering, Faculty of Engineering, Kiel University, Kaiserstr. 2, 24143 Kiel, Germany.

Biological neural circuits are based on the interplay of excitatory and inhibitory events to achieve functionality. Axons form long-range information highways in neural circuits. Axon pruning, i.e., …

Harnessing spatiotemporal transformation in magnetic domains for nonvolatile physical reservoir computing.

Sci Adv

January 2025

Institute of Materials Research and Engineering (IMRE), Agency for Science Technology and Research (A*STAR), 2 Fusionopolis Way, #08-03 Innovis, Singapore 138634, Republic of Singapore.

Combining physics with computational models is increasingly recognized for enhancing the performance and energy efficiency in neural networks. Physical reservoir computing uses material dynamics of physical substrates for temporal data processing. Despite the ease of training, building an efficient reservoir remains challenging.
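As background, the sketch below shows the reservoir-computing principle in software: a fixed random recurrent network whose state is mapped to an output by a trained linear readout. The NumPy implementation, the delay-recall task, and all parameter values are illustrative assumptions; in the work above the reservoir is the physical dynamics of magnetic domains, not a simulated network.

```python
# Software analogue of a reservoir computer: a fixed random "reservoir" plus a
# trained linear readout (ridge regression). Only the readout is ever trained.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res, n_steps = 1, 200, 2000

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))        # fixed input weights
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # spectral radius < 1 for stability

u = rng.uniform(-1.0, 1.0, (n_steps, n_in))         # random input signal
target = np.roll(u[:, 0], 5)                        # task: recall the input 5 steps back

x = np.zeros(n_res)
states = np.zeros((n_steps, n_res))
for t in range(n_steps):
    x = np.tanh(W @ x + W_in @ u[t])                # reservoir update (never trained)
    states[t] = x

# Train only the linear readout with ridge regression.
reg = 1e-4
W_out = np.linalg.solve(states.T @ states + reg * np.eye(n_res), states.T @ target)
pred = states @ W_out
print("readout MSE after washout:", np.mean((pred[100:] - target[100:]) ** 2))
```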

Non-volatile electronic memory elements are very attractive for applications, not only in information storage but also in logic circuits, sensing devices and neuromorphic computing. Here, a ferroelectric film of the guanine nucleobase is used in a resistive memory junction sandwiched between two different ferromagnetic films of Co and CoCr alloys. The magnetic films have an in-plane easy axis of magnetization and different coercive fields, whereas the guanine film ensures a very long spin transport length at 100 K.

Resistive memory-based zero-shot liquid state machine for multimodal event data learning.

Nat Comput Sci

January 2025

Key Lab of Fabrication Technologies for Integrated Circuits and Key Laboratory of Microelectronic Devices and Integrated Technology, Institute of Microelectronics of the Chinese Academy of Sciences, Beijing, China.

The human brain is a complex spiking neural network (SNN) capable of learning multimodal signals in a zero-shot manner by generalizing existing knowledge. Remarkably, it maintains minimal power consumption through event-based signal propagation. However, replicating the human brain in neuromorphic hardware presents both hardware and software challenges.
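As background for the spiking-network terminology, the sketch below simulates a single leaky integrate-and-fire neuron, the elementary unit of SNNs and liquid state machines. The parameter values and input drive are arbitrary illustrative choices, not values from this work.

```python
# Minimal leaky integrate-and-fire (LIF) neuron with illustrative parameters.
import numpy as np

dt, tau, v_th, v_reset = 1.0, 20.0, 1.0, 0.0   # time step (ms), membrane time constant (ms),
                                               # spike threshold, reset potential (arb. units)
rng = np.random.default_rng(1)
input_current = rng.uniform(0.0, 0.12, 200)    # random drive over 200 time steps

v, spikes = 0.0, []
for t, i_in in enumerate(input_current):
    v += dt / tau * (-v) + i_in                # leak toward rest, integrate the input
    if v >= v_th:                              # threshold crossing emits a discrete event
        spikes.append(t)
        v = v_reset
print(f"{len(spikes)} spikes at steps {spikes}")
```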

Improving Recall Accuracy in Sparse Associative Memories That Use Neurogenesis.

Neural Comput

January 2025

Electronics and Computer Science, University of Southampton, Southampton SO17 1BJ, U.K.

The creation of future low-power neuromorphic solutions requires specialist spiking neural network (SNN) algorithms that are optimized for neuromorphic settings. One such algorithmic challenge is the ability to recall learned patterns from their noisy variants. Solutions to this problem may be required to memorize vast numbers of patterns based on limited training data and subsequently recall the patterns in the presence of noise.
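To illustrate the underlying task of recalling a learned pattern from a noisy variant, the sketch below runs a classic dense Hopfield network with Hebbian storage. The paper above concerns sparse associative memories with neurogenesis, so this is only a simplified stand-in for the recall problem, with arbitrary sizes and noise level.

```python
# Classic Hopfield-style associative recall of a stored pattern from a noisy cue.
import numpy as np

rng = np.random.default_rng(2)
n, n_patterns = 100, 5
patterns = rng.choice([-1, 1], size=(n_patterns, n))

# Hebbian storage: sum of outer products, zero self-connections.
W = (patterns.T @ patterns).astype(float) / n
np.fill_diagonal(W, 0.0)

# Corrupt one stored pattern by flipping 15% of its bits.
cue = patterns[0].copy()
flip = rng.choice(n, size=15, replace=False)
cue[flip] *= -1

# Asynchronous updates until the state stops changing.
state = cue.copy()
for _ in range(20):
    prev = state.copy()
    for i in rng.permutation(n):
        state[i] = 1 if W[i] @ state >= 0 else -1
    if np.array_equal(state, prev):
        break

print("overlap with stored pattern:", (state @ patterns[0]) / n)  # 1.0 = perfect recall
```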
