Publications by authors named "Mile Gu"

Decoupling systems into independently evolving components has a long history of simplifying seemingly complex systems. Such decompositions enable a better understanding of the underlying dynamics and causal structures while providing more efficient means to simulate such processes on a computer. Here we outline a variational algorithm for decoupling unitary quantum dynamics, allowing us to decompose a given n-qubit unitary gate into multiple independently evolving sub-components.

Article Synopsis
  • Light carries complex information, and traditional methods for analyzing this data require multiple specific optical components, complicating detection systems.
  • This study introduces a metasurface-assisted graphene photodetector that can simultaneously identify different polarization states and wavelengths of light (1-8 μm) with high accuracy (0.5 μm).
  • Using advanced techniques like cooperative multiport metasurfaces and machine learning, the new device allows for effective separation of polarization and wavelength information, paving the way for compact and efficient spectral-polarization co-detection.

From correlations in measurement outcomes alone, can two otherwise isolated parties establish whether such correlations are atemporal? That is, can they rule out that they have been given the same system at two different times? Classical statistics says no, yet quantum theory disagrees. Here, we introduce the necessary and sufficient conditions by which such quantum correlations can be identified as atemporal. We demonstrate the asymmetry of atemporality under time reversal and reveal it to be a measure of spatial quantum correlation distinct from entanglement.


In covert target detection, Alice attempts to send optical or microwave probes to determine the presence or absence of a weakly reflecting target embedded in thermal background radiation within a target region, while striving to remain undetected by an adversary, Willie, who is co-located with the target and collects all light that does not return to Alice. We formulate this problem in a realistic setting and derive quantum-mechanical limits on Alice's error probability performance in entanglement-assisted target detection for any fixed level of her detectability by Willie. We demonstrate how Alice can approach this performance limit using two-mode squeezed vacuum probes in the regime of small to moderate background brightness, and how such protocols can outperform any conventional approach using Gaussian-distributed coherent states.


Distillation, or purification, is central to the practical use of quantum resources in noisy settings often encountered in quantum communication and computation. Conventionally, distillation requires using some restricted "free" operations to convert a noisy state into one that approximates a desired pure state. Here, we propose to relax this setting by only requiring the approximation of the measurement statistics of a target pure state, which allows for additional classical postprocessing of the measurement outcomes.


Numerous quantum error-mitigation protocols have been proposed, motivated by the critical need to suppress noise effects on intermediate-scale quantum devices. Yet, their general potential and limitations remain elusive. In particular, to understand the ultimate feasibility of quantum error mitigation, it is crucial to characterize the fundamental sampling cost: how many times an arbitrary mitigation protocol must run a noisy quantum device.


Heisenberg's uncertainty principle implies fundamental constraints on what properties of a quantum system we can simultaneously learn. However, it typically assumes that we probe these properties via measurements at a single point in time. In contrast, inferring causal dependencies in complex processes often requires interactive experimentation: multiple rounds of interventions where we adaptively probe the process with different inputs to observe how they affect outputs.

Article Synopsis
  • Complex systems, like weather patterns or ecosystems, affect our daily lives and can be hard to predict.
  • Stochastic modelling helps scientists understand these systems by predicting how they will behave based on past events.
  • Researchers used quantum technology to create more efficient models that need less memory and can make better predictions than traditional models.

Phase-insensitive optical amplifiers uniformly amplify each quadrature of an input field and are of both fundamental and technological importance. We find the quantum limit on the precision of estimating the gain of a quantum-limited phase-insensitive amplifier using a multimode probe that may also be entangled with an ancilla system. In stark contrast to the sensing of loss parameters, the average photon number N and number of input modes M of the probe are found to be equivalent and interchangeable resources for optimal gain sensing.


Exact results about the nonequilibrium thermodynamics of open quantum systems at arbitrary timescales are obtained by considering all possible variations of initial conditions of a system. First we obtain a quantum-information theoretic equality for entropy production, valid for an arbitrary initial joint state of system and environment. For any finite-time process with a fixed initial environment, we then show that the system's loss of distinction-relative to the minimally dissipative state-exactly quantifies its thermodynamic dissipation.


Effective and efficient forecasting relies on identification of the relevant information contained in past observations-the predictive features-and isolating it from the rest. When the future of a process bears a strong dependence on its behavior far into the past, there are many such features to store, necessitating complex models with extensive memories. Here, we highlight a family of stochastic processes whose minimal classical models must devote unboundedly many bits to tracking the past.


Realizing a quantum memory with a long coherence time is a major challenge for current quantum technology. Until now, the longest reported coherence time of a single qubit was 660 s, in a single Yb ion qubit, achieved through technical developments in sympathetic cooling and dynamical-decoupling pulses that addressed heating-induced detection inefficiency and magnetic-field fluctuations. However, it was not clear what prohibited further enhancement.


Quantifying how distinguishable two stochastic processes are is at the heart of many fields, such as machine learning and quantitative finance. While several measures have been proposed for this task, none have universal applicability and ease of use. In this article, we suggest a set of requirements for a well-behaved measure of process distinguishability.


Deterministic quantum computation with one qubit (DQC1) is iconic in highlighting that exponential quantum speedup may be achieved with negligible entanglement. Its discovery catalyzed a heated study of general quantum resources, and various conjectures regarding their role in DQC1's performance advantage. Coherence and discord are prominent candidates, respectively, characterizing nonclassicality within localized and correlated systems.


Modern computation relies crucially on modular architectures, breaking a complex algorithm into self-contained subroutines. A client can then call upon a remote server to implement parts of the computation independently via an application programming interface (API). Present APIs relay only classical information.


Simulations of stochastic processes play an important role in the quantitative sciences, enabling the characterisation of complex systems. Recent work has established a quantum advantage in stochastic simulation, leading to quantum devices that execute a simulation using less memory than possible by classical means. To realise this advantage it is essential that the memory register remains coherent, and coherently interacts with the processor, allowing the simulator to operate over many time steps.


The information-carrying capacity of a memory is known to be a thermodynamic resource facilitating the conversion of heat to work. Szilard's engine explicates this connection through a toy example involving an energy-degenerate two-state memory. We devise a formalism to quantify the thermodynamic value of memory in general quantum systems with nontrivial energy landscapes.


In stochastic modeling, there has been a significant effort towards finding predictive models that predict a stochastic process' future using minimal information from its past. Meanwhile, in condensed matter physics, matrix product states (MPS) are known as a particularly efficient representation of 1D spin chains. In this Letter, we associate each stochastic process with a suitable quantum state of a spin chain.


Quantum resource theories seek to quantify sources of nonclassicality that bestow quantum technologies their operational advantage. Chief among these are studies of quantum correlations and quantum coherence. The former isolates nonclassicality in the correlations between systems, and the latter captures nonclassicality of quantum superpositions within a single physical system.


The NOT gate that flips a classical bit is ubiquitous in classical information processing. However, its quantum analogue, the universal NOT (UNOT) gate that would flip a quantum spin in any alignment into its antipodal counterpart, is strictly forbidden. Here we explore the connection between this discrepancy and how UNOT gates affect classical and quantum correlations.


Stochastic processes are as ubiquitous throughout the quantitative sciences as they are notorious for being difficult to simulate and predict. In this Letter, we propose a unitary quantum simulator for discrete-time stochastic processes which requires less internal memory than any classical analogue throughout the simulation. The simulator's internal memory requirements equal those of the best previous quantum models.


Simulating quantum contextuality with classical systems requires memory. A fundamental yet open question is the minimum memory needed, and therefore the precise sense in which quantum systems outperform classical ones. Here, we make rigorous the notion of classically simulating quantum state-independent contextuality (QSIC) in the case of a single quantum system submitted to an infinite sequence of measurements randomly chosen from a finite QSIC set.


Many organisms capitalize on their ability to predict the environment to maximize available free energy and reinvest this energy to create new complex structures. This functionality relies on the manipulation of patterns-temporally ordered sequences of data. Here, we propose a framework to describe pattern manipulators-devices that convert thermodynamic work to patterns or vice versa-and use them to build a "pattern engine" that facilitates a thermodynamic cycle of pattern creation and consumption.


Computer simulation of observable phenomena is an indispensable tool for engineering new technology, understanding the natural world, and studying human society. However, the most interesting systems are often so complex that simulating their future behavior demands storing immense amounts of information regarding how they have behaved in the past. For increasingly complex systems, simulation becomes increasingly difficult and is ultimately constrained by resources such as computer memory.
