Publications by authors named "Robin Blume-Kohout"

Scalable quantum processors require high-fidelity universal quantum logic operations in a manufacturable physical platform. Donors in silicon provide atomic size, excellent quantum coherence and compatibility with standard semiconductor processing, but no entanglement between donor-bound electron spins has been demonstrated to date. Here we present the experimental demonstration and tomography of universal one- and two-qubit gates in a system of two weakly exchange-coupled electrons, bound to single phosphorus donors introduced in silicon by ion implantation.

The performance of quantum gates is often assessed using some form of randomized benchmarking. However, the existing methods become infeasible for more than approximately five qubits. Here we show how to use a simple and customizable class of circuits, randomized mirror circuits, to perform scalable, robust, and flexible randomized benchmarking of Clifford gates.
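
As a rough illustration of the mirror-circuit idea, the sketch below builds a random circuit from a small single-qubit gate set, appends the layer-by-layer inverse, and confirms that a noiseless processor returns to its initial state. This is a toy, not the full protocol from the paper, which also inserts random Pauli layers and two-qubit gate layers and analyzes an adjusted success probability.

```python
import numpy as np

# A small single-qubit Clifford subset used to build random layers.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
S = np.array([[1, 0], [0, 1j]], dtype=complex)
GATES = [I2, X, H, S]

def random_layer(n, rng):
    """Tensor product of randomly chosen single-qubit gates on n qubits."""
    U = np.array([[1.0 + 0j]])
    for _ in range(n):
        U = np.kron(U, GATES[rng.integers(len(GATES))])
    return U

def mirror_circuit_unitary(n, depth, rng):
    """Random layers followed by their inverses in reverse order."""
    layers = [random_layer(n, rng) for _ in range(depth)]
    U = np.eye(2**n, dtype=complex)
    for L in layers:
        U = L @ U
    for L in reversed(layers):
        U = L.conj().T @ U
    return U  # equals the identity when the gates are perfect

rng = np.random.default_rng(0)
n, depth = 2, 10
U = mirror_circuit_unitary(n, depth, rng)
psi0 = np.zeros(2**n, dtype=complex); psi0[0] = 1.0
survival = abs(psi0.conj() @ U @ psi0)**2
print(f"ideal survival probability: {survival:.6f}")  # -> 1.000000
```

With gate noise, the survival probability decays with depth, and that decay rate is what the benchmark extracts.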

Nuclear spins were among the first physical platforms to be considered for quantum information processing, because of their exceptional quantum coherence and atomic-scale footprint. However, their full potential for quantum computing has not yet been realized, owing to the lack of methods for linking nuclear qubits within a scalable device and performing multi-qubit operations with sufficient fidelity to sustain fault-tolerant quantum computation. Here we demonstrate universal quantum logic operations using a pair of ion-implanted ³¹P donor nuclei in a silicon nanoelectronic device.

If quantum information processors are to fulfill their potential, the diverse errors that affect them must be understood and suppressed. But errors typically fluctuate over time, and the most widely used tools for characterizing them assume static error modes and rates. This mismatch can cause unheralded failures, misidentified error modes, and wasted experimental effort.
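
The approach in this line of work is spectral: analyze a time-ordered record of outcomes in the frequency domain. The following is a hedged toy sketch of that general idea; the simulated drift, the transform choice, and the threshold here are illustrative assumptions, not the paper's method verbatim.

```python
import numpy as np
from scipy.fft import dct
from scipy.stats import chi2

# Simulate time-ordered outcomes of a fixed circuit whose success
# probability drifts slowly (a sinusoid, purely for illustration).
rng = np.random.default_rng(1)
T = 1024                                   # number of time-ordered shots
p_t = 0.9 + 0.08 * np.cos(2 * np.pi * 4 * np.arange(T) / T)
x = rng.binomial(1, p_t)                   # 0/1 outcome record

# Standardize and take a DCT. For static (drift-free) data each squared
# coefficient is approximately chi^2_1, so significant peaks flag drift.
p_hat = x.mean()
z = (x - p_hat) / np.sqrt(p_hat * (1 - p_hat))
power = dct(z, norm='ortho')**2

threshold = chi2.isf(0.05 / T, df=1)       # Bonferroni-corrected 5% test
drifting = np.nonzero(power[1:] > threshold)[0] + 1
print("significant frequency indices:", drifting)
```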

Benchmarking methods that can be adapted to multiqubit systems are essential for assessing the overall or "holistic" performance of nascent quantum processors. The current industry standard is Clifford randomized benchmarking (RB), which measures a single error rate that quantifies overall performance. But scaling Clifford RB to many qubits is surprisingly hard.

Randomized benchmarking (RB) is widely used to measure an error rate of a set of quantum gates, by performing random circuits that would do nothing if the gates were perfect. In the limit of no finite-sampling error, the exponential decay rate of the observable survival probabilities, versus circuit length, yields a single error metric r. For Clifford gates with arbitrary small errors described by process matrices, r was believed to reliably correspond to the mean, over all Clifford gates, of the average gate infidelity between the imperfect gates and their ideal counterparts.
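
A minimal sketch of the standard RB analysis referenced above: fit the survival probabilities to the usual model P(m) = A + B·pᵐ and convert the decay constant p into the RB number r = (d−1)(1−p)/d. The synthetic data and constants below are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def rb_decay(m, A, B, p):
    """Standard RB fit model: survival probability vs. circuit length m."""
    return A + B * p**m

d = 2                                   # single-qubit example
depths = np.array([1, 2, 4, 8, 16, 32, 64, 128])
rng = np.random.default_rng(2)
true_p = 0.995
P = 0.5 + 0.5 * true_p**depths + rng.normal(0, 0.005, depths.size)

(A, B, p), _ = curve_fit(rb_decay, depths, P, p0=[0.5, 0.5, 0.99])
r = (d - 1) * (1 - p) / d
print(f"estimated p = {p:.4f}, RB error rate r = {r:.2e}")
```

The subtlety flagged in this abstract is precisely whether this r faithfully equals the mean average gate infidelity of the gate set.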

Quantum information processors promise fast algorithms for problems inaccessible to classical computers. But since qubits are noisy and error-prone, they will depend on fault-tolerant quantum error correction (FTQEC) to compute reliably. Quantum error correction can protect against general noise if, and only if, the error in each physical qubit operation is smaller than a certain threshold.

A minimax estimator has the minimum possible error ("risk") in the worst case. We construct the first minimax estimators for quantum state tomography with relative entropy risk. The minimax risk of nonadaptive tomography scales as O(1/√N), in contrast to that of classical probability estimation, which is O(1/N), where N is the number of copies of the quantum state used.
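
For reference, the relative entropy risk in question is S(ρ‖σ) = Tr[ρ(log ρ − log σ)], the loss incurred by reporting σ when the true state is ρ. A small sketch of that loss function; the example states are arbitrary full-rank qubits chosen for illustration.

```python
import numpy as np
from scipy.linalg import logm

def rel_entropy(rho, sigma):
    """Quantum relative entropy S(rho||sigma) = Tr[rho (log rho - log sigma)]."""
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

# Loss of reporting sigma when the true state is rho. Both are full rank
# here, since S diverges when sigma lacks support on rho.
rho   = np.array([[0.8, 0.1], [0.1, 0.2]])
sigma = np.array([[0.7, 0.0], [0.0, 0.3]])
print(f"S(rho||sigma) = {rel_entropy(rho, sigma):.4f} nats")
```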

Quantum simulations promise to be one of the primary applications of quantum computers, should one be constructed. This article briefly summarizes the history of quantum simulation in light of the recent result of Wang and co-workers, demonstrating calculation of the ground and excited states for the HeH⁺ molecule, and concludes with a discussion of why this and other recent progress in the field suggest that quantum simulations of quantum chemistry have a bright future.

An intuitive realization of a qubit is an electron charge at two well-defined positions of a double quantum dot. This qubit is simple and has the potential for high-speed operation because of its strong coupling to electric fields. However, charge noise also couples strongly to this qubit, resulting in rapid dephasing at all but one special operating point called the 'sweet spot'.
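
A toy numerical check of the sweet-spot idea, assuming the standard two-level charge-qubit model H = (ε/2)σz + (Δ/2)σx with splitting E(ε) = √(ε² + Δ²): at ε = 0 the splitting is first-order insensitive to charge noise on ε.

```python
import numpy as np

# Two-level charge-qubit splitting; Delta is the tunnel coupling.
Delta = 1.0
E = lambda eps: np.sqrt(eps**2 + Delta**2)

for eps0 in [0.0, 0.5]:
    h = 1e-6
    dE = (E(eps0 + h) - E(eps0 - h)) / (2 * h)   # numerical dE/deps
    print(f"eps = {eps0}: dE/deps = {dE:.6f}")
# eps = 0.0 -> slope 0 (the sweet spot); eps = 0.5 -> nonzero slope,
# so charge noise dephases the qubit to first order away from eps = 0.
```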

We introduce a simple protocol for adaptive quantum state tomography, which reduces the worst-case infidelity 1 − F(ρ̂, ρ) between the estimate ρ̂ and the true state ρ from O(1/√N) to O(1/N). It uses a single adaptation step and just one extra measurement setting. In a linear optical qubit experiment, we demonstrate a full order of magnitude reduction in infidelity (from 0.
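
A hedged single-qubit sketch of the one-step adaptation described above: spend half the copies on fixed Pauli measurements to get a rough estimate, then measure the rest in the eigenbasis of that estimate. The linear-inversion stage-1 estimator, shot allocation, and names are illustrative choices, not the paper's exact procedure.

```python
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0]).astype(complex)

def measure(rho, basis_vecs, shots, rng):
    """Sample projective outcomes of rho in an orthonormal basis (columns)."""
    probs = [np.real(v.conj() @ rho @ v) for v in basis_vecs.T]
    return rng.multinomial(shots, probs)

rng = np.random.default_rng(3)
theta = 0.1
psi = np.array([np.cos(theta), np.sin(theta)], dtype=complex)
rho_true = np.outer(psi, psi.conj())

# Stage 1: rough estimate from X, Y, Z measurements (linear inversion).
N = 10000
expvals = {}
for name, P in [('X', X), ('Y', Y), ('Z', Z)]:
    _, vecs = np.linalg.eigh(P)              # eigenvalues ascending: -1, +1
    n = measure(rho_true, vecs, N // 6, rng)
    expvals[name] = (n[1] - n[0]) / (N // 6)
rho_hat = 0.5 * (I + expvals['X']*X + expvals['Y']*Y + expvals['Z']*Z)

# Stage 2: the one extra setting -- the eigenbasis of the stage-1 estimate.
_, vecs = np.linalg.eigh(rho_hat)
n = measure(rho_true, vecs, N // 2, rng)
print("stage-2 counts in the adapted basis:", n)
```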

This Letter proposes and analyzes a new method for quantum state estimation, called hedged maximum likelihood (HMLE). HMLE is a quantum version of Lidstone's law, also known as the "add β" rule. A straightforward modification of maximum likelihood estimation (MLE), it can be used as a plug-in replacement for MLE.
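
For intuition, here is the classical Lidstone "add β" rule that HMLE generalizes: hedge the empirical frequencies so that no outcome is ever assigned probability exactly zero from finite data. The value β = 1/2 below is illustrative.

```python
import numpy as np

def lidstone(counts, beta=0.5):
    """Classical 'add beta' rule: (n_i + beta) / (N + K*beta)."""
    counts = np.asarray(counts, dtype=float)
    K = counts.size
    return (counts + beta) / (counts.sum() + K * beta)

# The unseen third outcome gets a small nonzero probability.
print(lidstone([8, 2, 0]))   # -> [0.739 0.217 0.043] (approximately)
```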

Suppose an experimentalist wishes to verify that his apparatus produces entangled quantum states. A finite amount of data cannot conclusively demonstrate entanglement, so drawing conclusions from real-world data requires statistical reasoning. We propose a reliable method to quantify the weight of evidence for (or against) entanglement, based on a likelihood ratio test.
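
A classical toy example of the likelihood-ratio logic, with a binomial test standing in for the paper's comparison of the separable-state model class against all quantum states. (This is only an analogy; the quantum case needs extra care, in part because the null hypothesis sits on a boundary of the model.)

```python
from scipy.stats import binom, chi2

# Compare a restricted model (p = p0, playing the role of "separable")
# against the full model (p free, playing the role of "all states").
n, k = 100, 62                        # trials and successes
p0 = 0.5                              # restricted-model hypothesis
p_mle = k / n                         # best fit over the full model

log_lambda = binom.logpmf(k, n, p0) - binom.logpmf(k, n, p_mle)
stat = -2 * log_lambda                # Wilks: ~chi^2_1 under the null
print(f"-2 log lambda = {stat:.2f}, p-value = {chi2.sf(stat, df=1):.3f}")
```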

Quantum Darwinism, the redundant encoding of information about a decohering system in its environment, was proposed to reconcile the quantum nature of our Universe with apparent classicality. We report the first study of the dynamics of quantum Darwinism in a realistic model of decoherence, quantum Brownian motion. Prepared in a highly squeezed state (a macroscopic superposition), the system leaves records whose redundancy increases rapidly with initial delocalization.

We introduce a general operational characterization of information-preserving structures (encompassing noiseless subsystems, decoherence-free subspaces, pointer bases, and error-correcting codes) by demonstrating that they are isometric to fixed points of unital quantum processes. Using this, we show that every information-preserving structure is a matrix algebra. We further establish a structure theorem for the fixed states and observables of an arbitrary process, which unifies the Schrödinger and Heisenberg pictures, places restrictions on physically allowed kinds of information, and provides an efficient algorithm for finding all noiseless and unitarily noiseless subsystems of the process.
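
A small numerical sketch of the fixed-point computation this result builds on: vectorize a channel as a superoperator from its Kraus operators and extract the eigenvalue-1 eigenspace. The example channel, Z-dephasing, is an assumption for illustration; it is unital, and its fixed-point set is the diagonal matrix algebra.

```python
import numpy as np

def superoperator(kraus):
    """Row-major-vectorized superoperator of rho -> sum_K K rho K^dag."""
    return sum(np.kron(K, K.conj()) for K in kraus)

# Example: Z-dephasing with probability p (a unital channel).
Z = np.diag([1.0, -1.0]).astype(complex)
p = 0.3
kraus = [np.sqrt(1 - p) * np.eye(2, dtype=complex), np.sqrt(p) * Z]

S = superoperator(kraus)
evals, evecs = np.linalg.eig(S)
fixed = [evecs[:, i].reshape(2, 2) for i in range(len(evals))
         if abs(evals[i] - 1) < 1e-10]
for F in fixed:
    print(np.round(F, 6))
# The fixed space is spanned by |0><0| and |1><1|: a diagonal algebra,
# consistent with the matrix-algebra characterization above.
```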

We present an efficient quantum algorithm to measure the average fidelity decay of a quantum map under perturbation using a single bit of quantum information. Our algorithm scales only as the complexity of the map under investigation. Thus for those maps admitting an efficient gate decomposition, it provides an exponential speedup over known classical procedures.
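
For small dimensions, the quantity the quantum algorithm estimates can be checked classically. A sketch under stated assumptions: the perturbed map is Uₚ = P·U₀ for a weak unitary perturbation P, and the Haar-average fidelity of a unitary V is F̄(V) = (|Tr V|² + d)/(d² + d); the perturbation construction below is illustrative.

```python
import numpy as np
from scipy.stats import unitary_group

d = 8
U0 = unitary_group.rvs(d, random_state=42)       # unperturbed map

# Weak perturbation P = exp(-i * eps * H) for a random Hermitian H.
rng = np.random.default_rng(0)
A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
Hm = (A + A.conj().T) / 2
w, V = np.linalg.eigh(Hm)
P = V @ np.diag(np.exp(-1j * 0.05 * w)) @ V.conj().T
Up = P @ U0                                      # perturbed map

def avg_fidelity(M):
    """Haar average of |<psi| M |psi>|^2 over pure states."""
    dd = M.shape[0]
    return (abs(np.trace(M))**2 + dd) / (dd**2 + dd)

# Fidelity decay: compare n steps of U0 against n steps of Up.
for n in range(1, 6):
    Vn = np.linalg.matrix_power(U0.conj().T, n) @ np.linalg.matrix_power(Up, n)
    print(n, f"{avg_fidelity(Vn):.4f}")
```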
