The intrinsic variability of switching behavior in memristors has been a major obstacle to their adoption as the next generation of universal memory. On the other hand, this natural stochasticity can be valuable for hardware security applications. Here we propose and demonstrate a novel true random number generator utilizing the stochastic delay time of threshold switching in an Ag:SiO2 diffusive memristor, which offers clear advantages in scalability, circuit complexity, and power consumption. The random bits generated by the diffusive memristor true random number generator pass all 15 NIST randomness tests without any post-processing, a first for memristive-switching true random number generators. Based on nanoparticle dynamic simulation and analytical estimates, we attribute the stochasticity in delay time to the probabilistic process by which Ag particles detach from a Ag reservoir. This work paves the way for memristors in hardware security applications for the era of the Internet of Things.

Memristors can switch between high and low electrical-resistance states, but the switching behaviour can be unpredictable. Here, the authors harness this unpredictability to develop a memristor-based true random number generator that uses the stochastic delay time of threshold switching.
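The bit-extraction idea can be sketched in a few lines: treat the delay of each threshold-switching event as a random draw and threshold it against a calibration reference. The lognormal delay model and the median-threshold scheme below are illustrative assumptions for the sketch, not the paper's measured distribution or readout circuit.

```python
import random
import statistics

def measure_delay():
    """Stand-in for one threshold-switching event: in the hardware, the
    delay between pulse onset and the resistance drop is stochastic.
    Here it is modeled with a lognormal distribution (an assumption)."""
    return random.lognormvariate(0, 0.5)

def delays_to_bits(delays, reference):
    """Each delay yields one bit: 1 if it exceeds the reference
    (e.g. the median of a calibration run), else 0."""
    return [1 if d > reference else 0 for d in delays]

random.seed(42)
# Calibrate the threshold on one batch of delays, then generate bits.
calibration = [measure_delay() for _ in range(1000)]
reference = statistics.median(calibration)

bits = delays_to_bits([measure_delay() for _ in range(1000)], reference)
ones = sum(bits)
print(ones)  # close to 500 for an unbiased stream
```

Thresholding at the median makes the 0/1 split unbiased by construction, which is one reason a delay-based scheme can pass frequency-type randomness tests without post-processing.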

Download full-text PDF

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5638922
DOI: http://dx.doi.org/10.1038/s41467-017-00869-x

Publication Analysis

Top Keywords

true random (20)
random number (20)
number generator (16)
diffusive memristor (12)
delay time (12)
novel true (8)
hardware security (8)
security applications (8)
stochastic delay (8)
time threshold (8)

Similar Publications

Does randomization assert the balance across trial arms? Revisiting Worrall's criticism.

Hist Philos Life Sci

January 2025

Faculty of Philosophy, Institute of Philosophy, Jagiellonian University, Grodzka 52, Kraków, Poland.

We revisit John Worrall's long-standing but still prominent argument against the view that randomization balances the impact of both known and unknown confounders across the treatment and control arms. We argue that his argument, which invokes indefinitely many possible confounders, is at odds with statistical theory because it (1) presumes that the purpose of randomized studies is to obtain perfect point estimates, for which perfect balance would be needed; (2) conflates equalizing each individual confounder with equalizing the overall (average) impact of all confounders; and (3) assumes that the joint effect of an infinite series of confounders cannot be bounded. We defend the role of randomization in balancing the impact of confounders across the treatment and control arms by putting forward the statistical sense of the balance claim.
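The "balance on average" claim is easy to check numerically. The sketch below uses illustrative parameters (a 1/k^2 decay of confounder effect sizes is an assumption that makes the joint effect of arbitrarily many confounders summable, addressing point (3)): patients carrying many confounders are randomized into two arms, and the between-arm difference in total confounder impact is centred on zero across repeated randomizations.

```python
import random
import statistics

random.seed(0)

def arm_imbalance(n_patients=200, n_confounders=500):
    """One trial: each patient's many confounders sum to a single
    prognostic score; randomization splits patients into two arms and
    we measure the between-arm difference in mean score. Effect sizes
    decay as 1/k^2, so the joint effect stays bounded as the number of
    confounders grows."""
    effects = [1.0 / k**2 for k in range(1, n_confounders + 1)]
    scores = [sum(e * random.gauss(0, 1) for e in effects)
              for _ in range(n_patients)]
    random.shuffle(scores)          # the randomization step
    half = n_patients // 2
    return statistics.mean(scores[:half]) - statistics.mean(scores[half:])

# Across many randomizations the imbalance averages out near zero.
diffs = [arm_imbalance() for _ in range(300)]
print(round(statistics.mean(diffs), 3))
```

Any single randomization can leave a nonzero imbalance; the statistical sense of the balance claim is about the distribution of that imbalance, which is centred on zero and has bounded spread.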

View Article and Find Full Text PDF

Improved analysis of supervised learning in the RKHS with random features: Beyond least squares.

Neural Netw

January 2025

City University of Hong Kong Shenzhen Research Institute, Shenzhen, China; Department of Mathematics, City University of Hong Kong, Hong Kong, China. Electronic address:

We consider kernel-based supervised learning using random Fourier features, focusing on its statistical error bounds and generalization properties with general loss functions. Beyond the least squares loss, existing results only provide worst-case analyses with rate n^{-1/2} and a number of features at least comparable to n, and refined analyses that achieve an almost n^{-1} rate when the kernel's eigenvalue decay is exponential and the number of features is again at least comparable to n. For the least squares loss, the results are much richer, and the optimal rates can be achieved under the source and capacity assumptions with a number of features smaller than n.
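For readers unfamiliar with the technique, the standard Rahimi-Recht random Fourier features construction approximates the Gaussian (RBF) kernel with an explicit finite-dimensional feature map, so kernel methods can be replaced by linear ones in the feature space. A minimal sketch, assuming unit bandwidth and illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_fourier_features(X, n_features, rng):
    """Map X into a feature space whose inner products approximate the
    Gaussian kernel k(x, y) = exp(-||x - y||^2 / 2)."""
    d = X.shape[1]
    W = rng.normal(size=(d, n_features))       # frequencies ~ N(0, I)
    b = rng.uniform(0, 2 * np.pi, n_features)  # random phases
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = rng.normal(size=(50, 3))
Z = random_fourier_features(X, n_features=2000, rng=rng)

# Compare the exact kernel matrix with its feature-map approximation.
K_true = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
K_approx = Z @ Z.T
err = np.abs(K_true - K_approx).max()
print(err)  # shrinks as the number of features grows
```

The trade-off studied in the paper is exactly how small `n_features` can be, relative to the sample size n, while preserving the learning rate.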


Purpose: To investigate the effect of average intraocular pressure (IOP) on the true rate of glaucoma progression (RoP) in the United Kingdom Glaucoma Treatment Study (UKGTS).

Methods: UKGTS participants were randomized to placebo or latanoprost drops and monitored for up to two years with visual field tests (VF, 24-2 SITA standard), IOP measurements, and optic nerve imaging. We included eyes with at least three structural or functional assessments (VF with <15% false-positive errors).


The challenge of imaging low-density objects in an electron microscope without causing beam damage is significant in modern transmission electron microscopy. This is especially true for life science imaging, where the sample, rather than the instrument, still determines the resolution limit. Here, we explore whether we have to accept this or can progress further in this area.

