Information in conventional digital computing platforms is encoded in the steady states of transistors and processed in a quasi-static way. Memristors are a class of emerging devices that naturally embody dynamics through their internal electrophysical processes, enabling nonconventional computing paradigms with enhanced capability and energy efficiency, such as reservoir computing. Here, we report on a dynamic memristor based on LiNbO3. The device has nonlinear I-V characteristics and exhibits short-term memory, making it suitable for reservoir computing. Through time multiplexing, a single device can serve as a reservoir with rich dynamics that would otherwise require a large number of interconnected nodes. After trains of pulses are applied to five such memristors, their collective states are unique to each combination of pulse patterns, which makes the scheme suitable for sequence classification, as demonstrated in a 5 × 4 digit image recognition task. This work broadens the spectrum of memristive materials for neuromorphic computing.
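As a rough illustration of the time-multiplexing idea in the abstract above (not a model of the actual LiNbO3 device), the Python sketch below uses a toy nonlinear node with fading memory: the state sampled after each input pulse is treated as one virtual reservoir node, so a single device maps every pulse pattern to its own state vector. The device equation, time constant, and pulse amplitude are all assumptions made for illustration.

```python
import numpy as np
from itertools import product

# Toy dynamical node (an assumption, not the LiNbO3 device model from the paper):
# the state g relaxes toward a pulse-dependent target with a short time constant,
# giving both a nonlinear response and short-term (fading) memory.
def run_device(pulses, g0=0.1, tau=2.5):
    g = g0
    states = []
    for p in pulses:                       # p in {0, 1}: absence/presence of a pulse
        target = np.tanh(g + 0.8 * p)      # nonlinear response to the input pulse
        g = g + (target - g) / tau         # relaxation: earlier pulses fade gradually
        states.append(g)
    return np.array(states)

# Time multiplexing: the state sampled after each pulse acts as one virtual node,
# so a single device yields a whole reservoir state vector per pulse train.
patterns = list(product([0, 1], repeat=4))               # all 4-pulse input patterns
reservoir = np.array([run_device(p) for p in patterns])  # shape: (16 patterns, 4 nodes)

# Distinct pulse patterns land on distinct state vectors, which is what lets a
# simple trained linear readout separate them (e.g., for digit recognition).
print(np.round(reservoir, 3))
```

Enumerating all sixteen 4-pulse patterns is only meant to show that each pattern ends up with a unique state vector; a ridge-regression readout on the concatenated states of several such devices would complete the reservoir-computing pipeline.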
| Download full-text PDF | Source |
| --- | --- |
| http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10126362 | PMC |
| http://dx.doi.org/10.3389/fnins.2023.1177118 | DOI Listing |
Lithofacies classification and identification are of great significance in the exploration and evaluation of tight sandstone reservoirs. Existing methods of lithofacies identification in tight sandstone reservoirs suffer from time-consuming manual classification, strong subjectivity, and insufficient sample datasets, which makes it challenging to analyze the lithofacies characteristics of these reservoirs during oil and gas exploration. In this paper, the Fuyu oil formation in the Songliao Basin is selected as the target area, and an intelligent method for identifying the lithofacies of tight sandstone reservoirs, based on a hybrid multilayer perceptron (MLP) and multivariate time series mixer (MTS-Mixers), is proposed.
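The abstract only names the hybrid MLP / MTS-Mixers model, so the following is merely a minimal, untrained NumPy sketch of the factorized mixing idea such mixer-style architectures build on: one MLP mixes along the depth axis of a well-log window, a second mixes across the log channels, and a flat MLP head outputs lithofacies logits. The window length, the five log curves, the four classes, and every parameter shape are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes: a depth window of 16 samples, 5 log curves (e.g. GR, AC, DEN,
# CNL, RT), mapped to 4 illustrative lithofacies classes.
T, C, n_classes = 16, 5, 4

def mlp(x, w1, b1, w2, b2):
    """Two-layer perceptron with ReLU, applied along the last axis of x."""
    h = np.maximum(x @ w1 + b1, 0.0)
    return h @ w2 + b2

# Randomly initialised (untrained) parameters, for illustration only.
Wt1, bt1 = 0.1 * rng.normal(size=(T, 2 * T)), np.zeros(2 * T)   # depth (temporal) mixing
Wt2, bt2 = 0.1 * rng.normal(size=(2 * T, T)), np.zeros(T)
Wc1, bc1 = 0.1 * rng.normal(size=(C, 2 * C)), np.zeros(2 * C)   # log-curve (channel) mixing
Wc2, bc2 = 0.1 * rng.normal(size=(2 * C, C)), np.zeros(C)
Wout, bout = 0.1 * rng.normal(size=(T * C, n_classes)), np.zeros(n_classes)

def mixer_block(x):
    """One factorised mixing block: mix along depth, then across log channels."""
    x = x + mlp(x.T, Wt1, bt1, Wt2, bt2).T   # temporal mixing over the depth axis
    x = x + mlp(x, Wc1, bc1, Wc2, bc2)       # channel mixing over the log curves
    return x

window = rng.normal(size=(T, C))             # one standardised well-log window
logits = mixer_block(window).reshape(-1) @ Wout + bout
print("predicted lithofacies class:", int(np.argmax(logits)))
```

A real model would train these weights on labelled core and log data and would likely stack several mixing blocks; the point of the sketch is only the factorized depth-plus-channel mixing that distinguishes mixer-style networks from a plain point-wise MLP.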
View Article and Find Full Text PDF
Sci Rep
December 2024
Department of Applied Mathematics, Tokyo University of Science, Shinjuku, Tokyo, 162-8601, Japan.
Reservoir computing is a machine learning framework that exploits nonlinear dynamics and exhibits significant computational capabilities. One of its defining characteristics is that only the linear output, given by a linear combination of reservoir variables, is trained. Inspired by recent mathematical studies of generalized synchronization, we propose a novel reservoir computing framework with a generalized readout that includes nonlinear combinations of reservoir variables.
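As a hedged illustration of the contrast drawn in this abstract (not the authors' construction), the sketch below drives a standard echo state reservoir and compares a purely linear readout with a generalized readout that also feeds squared reservoir variables into the ridge regression; the reservoir size, spectral radius, target function, and choice of quadratic features are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# A small echo state reservoir (sizes and spectral radius are illustrative choices).
N, T, washout = 50, 2000, 100
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale to spectral radius 0.9
w_in = rng.uniform(-1, 1, size=N)

u = rng.uniform(-1, 1, size=T)                 # random input sequence
x, X = np.zeros(N), np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])           # reservoir update
    X[t] = x
X, u = X[washout:], u[washout:]                # discard the initial transient

y = np.sin(5 * u)                              # a nonlinear target of the input

def ridge_fit(F, y, lam=1e-6):
    """Ordinary ridge regression for the readout weights."""
    return np.linalg.solve(F.T @ F + lam * np.eye(F.shape[1]), F.T @ y)

w_lin = ridge_fit(X, y)                        # linear readout: combination of x_i only
F = np.hstack([X, X**2])                       # generalized readout: add x_i**2 terms
w_gen = ridge_fit(F, y)

print("linear readout MSE:     ", np.mean((X @ w_lin - y) ** 2))
print("generalized readout MSE:", np.mean((F @ w_gen - y) ** 2))
```

Adding cross terms x_i * x_j, or other nonlinear functions of the state, is the same idea at larger scale: only the readout grows, while the reservoir itself stays untrained.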
View Article and Find Full Text PDF
Sci Rep
December 2024
Department of Mathematics, Payame Noor University, Tehran, Iran.
In the realm of petroleum extraction, well productivity declines as reservoirs deplete, eventually reaching a point where continued extraction becomes economically unfeasible. To counteract this, artificial lift techniques are employed, with gas injection being a prevalent method. Ideally, unrestricted gas injection could maximize oil output.
View Article and Find Full Text PDF
Trop Med Infect Dis
December 2024
Evolutionary Ecology Group, Department of Biology, University of Antwerp, Campus Drie Eiken, Universiteitsplein 1, Wilrijk, 2610 Antwerp, Belgium.
This sandfly species is a vector of Leishmania, the causative agent of cutaneous leishmaniasis. This study assessed its abundance and distribution in different habitats and in human houses situated at varying distances from hyrax (reservoir host) dwellings in Wolaita Zone, southern Ethiopia. Sandflies were collected from January 2020 to December 2021 using CDC light traps, sticky paper traps, and locally made emergence traps.
View Article and Find Full Text PDF
Biomimetics (Basel)
December 2024
IDLab-AIRO, Faculty of Engineering and Architecture, Ghent University, 9052 Ghent, Belgium.
The performance of echo state networks (ESNs) in temporal pattern learning tasks depends both on their memory capacity (MC) and their non-linear processing. It has been shown that linear memory capacity is maximized when ESN neurons have linear activation, and that a trade-off between non-linearity and linear memory capacity is required for temporal pattern learning tasks. The more recent distance-based delay networks (DDNs) have shown improved memory capacity over ESNs in several benchmark temporal pattern learning tasks.
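To make the memory-capacity/nonlinearity trade-off concrete (a generic, textbook-style estimate rather than the DDN setup studied in the paper), the sketch below estimates linear MC as the sum over delays k of the squared correlation between u(t-k) and its ridge reconstruction from the reservoir state, comparing linear and tanh activations; the network size, spectral radius, maximum delay, and regularization are arbitrary assumptions.

```python
import numpy as np

def memory_capacity(activation, seed=0, N=50, T=4000, washout=200, max_delay=40, lam=1e-6):
    """Estimate linear memory capacity: sum over delays k of the squared correlation
    between the delayed input u(t-k) and its ridge reconstruction from the state x(t)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(N, N))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))       # scale to spectral radius 0.9
    w_in = rng.uniform(-0.5, 0.5, size=N)
    u = rng.uniform(-1, 1, size=T)
    x, X = np.zeros(N), np.zeros((T, N))
    for t in range(T):
        x = activation(W @ x + w_in * u[t])               # reservoir update
        X[t] = x
    X, u = X[washout:], u[washout:]                       # discard the initial transient
    mc = 0.0
    for k in range(1, max_delay + 1):
        F, y = X[k:], u[:-k]                              # reconstruct u(t-k) from x(t)
        w = np.linalg.solve(F.T @ F + lam * np.eye(N), F.T @ y)
        r = np.corrcoef(F @ w, y)[0, 1]
        if np.isfinite(r):                                # guard against a degenerate fit
            mc += r * r
    return mc

print("MC with linear activation:", memory_capacity(lambda a: a))  # typically much higher
print("MC with tanh activation:  ", memory_capacity(np.tanh))      # nonlinearity trades off memory
```

With the saturating activation, reconstruction quality decays faster with delay, which is exactly the trade-off between non-linearity and linear memory capacity that the abstract describes.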
View Article and Find Full Text PDF