Reservoir computing is a neuromorphic architecture that may offer viable solutions to the growing energy costs of machine learning. In software-based machine learning, computing performance can be readily reconfigured to suit different computational tasks by tuning hyperparameters. This critical functionality is missing in 'physical' reservoir computing schemes, which exploit the nonlinear and history-dependent responses of physical systems for data processing. Here we overcome this issue with a 'task-adaptive' approach to physical reservoir computing. By leveraging a thermodynamic phase space to reconfigure key reservoir properties, we optimize computational performance across a diverse task set. We use the spin-wave spectra of the chiral magnet Cu2OSeO3, which hosts skyrmion, conical and helical magnetic phases, providing on-demand access to different computational reservoir responses. The task-adaptive approach is applicable to a wide variety of physical systems, which we demonstrate in other chiral magnets: above room temperature in Co-Zn-Mn and near room temperature in FeGe.
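In reservoir computing of this kind, only a linear readout is trained on top of fixed nonlinear dynamics; in a physical reservoir, the material response plays the role of the recurrent network, and the "hyperparameters" the abstract refers to (e.g. spectral radius or input scaling in software, magnetic phase in Cu2OSeO3) shape that response. A minimal software analogue is an echo state network; the sketch below is illustrative only (all dimensions, weight scales and the toy sine-prediction task are assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 1 input channel, 100 reservoir nodes, 500 time steps.
n_in, n_res, n_steps = 1, 100, 500

# Fixed random weights: these are NOT trained, only the readout is.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Scale the spectral radius below 1 -- a typical ESN hyperparameter.
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = np.zeros((len(u), n_res))
    for t, u_t in enumerate(u):
        x = np.tanh(W_in @ u_t + W @ x)  # fixed nonlinear, history-dependent update
        states[t] = x
    return states

# Toy task (placeholder for a benchmark): predict a sine wave one step ahead.
u = np.sin(np.linspace(0, 20 * np.pi, n_steps + 1))[:, None]
X = run_reservoir(u[:-1])
y = u[1:, 0]

# Training touches only the linear readout (ridge regression).
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = X @ W_out
```

The design point this illustrates: because training is a single linear solve, swapping the reservoir (here, rescaling W; in the paper, switching magnetic phase) is cheap, which is what makes a task-adaptive reservoir practical.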


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10769874
DOI: http://dx.doi.org/10.1038/s41563-023-01698-8


Similar Publications

The latent viral reservoir remains the major barrier to HIV cure, placing the burden of strict adherence to antiretroviral therapy (ART) on people living with HIV to prevent recrudescence of viremia. For infants with perinatally acquired HIV, adherence is anticipated to be a lifelong need. In this study, we tested the hypothesis that administration of ART and viral Envelope-specific rhesus-derived IgG1 monoclonal antibodies (RhmAbs) with or without the IL-15 superagonist N-803 early in infection would limit viral reservoir establishment in SIV-infected infant rhesus macaques.


Harnessing spatiotemporal transformation in magnetic domains for nonvolatile physical reservoir computing.

Sci Adv

January 2025

Institute of Materials Research and Engineering (IMRE), Agency for Science Technology and Research (A*STAR), 2 Fusionopolis Way, #08-03 Innovis, Singapore 138634, Republic of Singapore.

Combining physics with computational models is increasingly recognized as a route to enhancing performance and energy efficiency in neural networks. Physical reservoir computing uses the material dynamics of physical substrates for temporal data processing. Despite the ease of training, building an efficient reservoir remains challenging.


Replay as a Basis for Backpropagation Through Time in the Brain.

Neural Comput

January 2025

Department of Psychological and Brain Sciences, Indiana University Bloomington, Bloomington, IN 47405, U.S.A.

How episodic memories are formed in the brain is a continuing puzzle for the neuroscience community, as are the brain areas that are critical for episodic learning.


Objectives: The size, shape, and contractility of the heart's atrial chambers have not been evaluated in fetuses with growth restriction (FGR) or who are small-for-gestational-age (SGA) as defined by the Delphi consensus protocol. This study aimed to examine the atrial chambers using speckle tracking analysis to identify any changes that may be specific for either growth disturbance.

Methods: Sixty-three fetuses with an estimated fetal weight below the 10th percentile were evaluated and classified as FGR or SGA based on the Delphi consensus protocol.


Improving the performance of echo state networks through state feedback.

Neural Netw

December 2024

Wyant College of Optical Sciences, University of Arizona, Tucson, AZ, USA.

Reservoir computing, using nonlinear dynamical systems, offers a cost-effective alternative to neural networks for complex tasks involving processing of sequential data, time series modeling, and system identification. Echo state networks (ESNs), a type of reservoir computer, mirror neural networks but simplify training. They apply fixed, random linear transformations to the internal state, followed by nonlinear changes.

