It is known that multiphoton states can be protected from decoherence caused by a passive loss channel by applying noiseless attenuation before, and noiseless amplification after, the channel. In this work, we propose combining multiphoton subtraction on four-component cat codes with teleamplification to effectively suppress errors under detection and environmental losses. The back-action of multiphoton subtraction modifies the qubit encoded on cat states by suppressing the higher photon numbers, while simultaneously ensuring that the original qubit can be recovered effectively through teleamplification followed by error correction, thus preserving its quantum information.
We propose an all-linear-optical scheme to ballistically generate a cluster state for measurement-based topological fault-tolerant quantum computation using hybrid photonic qubits entangled in a continuous-discrete domain. Availability of near-deterministic Bell-state measurements on hybrid qubits is exploited for this purpose. In the presence of photon losses, we show that our scheme leads to a significant enhancement in both tolerable photon-loss rate and resource overheads.
Recent quantum technologies utilize complex multidimensional processes that govern the dynamics of quantum systems. We develop an adaptive diagonal-element-probing compression technique that feasibly characterizes any unknown quantum process using far fewer measurements than conventional methods. This technique utilizes compressive projective measurements that are generalizable to an arbitrary number of subsystems.
Standard computation of the size and credibility of a Bayesian credible region for certifying a point estimator of an unknown parameter (such as a quantum state, channel, or phase) requires selecting the points of a finite parameter-space sample that lie in the region, which is infeasible for large datasets or dimensions because the region then becomes extremely small. We solve this problem by introducing the in-region sampling theory, which computes both region qualities by sampling appropriate functions over the region itself using any Monte Carlo sampling method.
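The in-region idea can be conveyed with a one-dimensional toy model (all parameter values here are assumptions, and the flat-prior Gaussian-likelihood setup is far simpler than the quantum-state regions treated in the work): rather than testing which points of a global sample fall inside the bounded-likelihood region, one samples only inside the region and averages the appropriate functions there.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (assumptions): flat prior on [0, 1], Gaussian likelihood in theta.
a, b = 0.0, 1.0
mu, sigma = 0.55, 0.05

def L(th):
    return np.exp(-(th - mu)**2 / (2 * sigma**2))

# Bounded-likelihood credible region R = {theta : L(theta) >= lam * max L}.
lam = 0.1
half_width = sigma * np.sqrt(-2 * np.log(lam))  # analytic boundary for this toy model
lo, hi = mu - half_width, mu + half_width

# In-region sampling: draw points only *inside* R and average functions there,
# instead of checking membership of a huge global parameter-space sample.
theta = rng.uniform(lo, hi, 100_000)

# Size = prior content of R (flat prior -> just the volume ratio).
size = (hi - lo) / (b - a)

# Credibility = posterior content of R = |R| * <posterior density over R>.
post_norm = sigma * np.sqrt(2 * np.pi)  # likelihood integral; tails outside [0, 1] are negligible
credibility = (hi - lo) * L(theta).mean() / post_norm
print(f"size = {size:.4f}, credibility = {credibility:.4f}")
```

The point of the construction is that both quantities come from averages over in-region samples only, so they remain computable even when the region occupies a vanishing fraction of the full parameter space.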
In continuous-variable tomography, with finite data and limited computation resources, reconstruction of a quantum state of light is performed on a finite-dimensional subspace. In principle, the data themselves encode all information about the relevant subspace that physically contains the state. We provide a straightforward and numerically feasible procedure to uniquely determine the appropriate reconstruction subspace by extracting this information directly from the data for any given unknown quantum state of light and measurement scheme.
We reveal that quadrature squeezing can result in significantly better quantum-estimation performance with quantum heterodyne detection (of H. P. Yuen and J.
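A minimal Monte Carlo sketch of why squeezing can help heterodyne-based estimation (a Gaussian toy model with assumed parameter values, not the paper's analysis): heterodyne outcomes carry the probe's quadrature variance plus half a unit of vacuum noise, so a probe squeezed in the measured quadrature yields a lower-variance estimate of a displacement.

```python
import numpy as np

rng = np.random.default_rng(1)

# Heterodyne detection samples both quadratures at once; each x-outcome carries
# the probe's quadrature variance plus half a vacuum unit of extra noise
# (convention: vacuum quadrature variance = 1/2).
def heterodyne_x(mean_x, quad_var, n):
    return rng.normal(mean_x, np.sqrt(quad_var + 0.5), n)

x0, n, r = 1.3, 200_000, 1.0  # unknown displacement, shots, squeezing (assumed values)
coh = heterodyne_x(x0, 0.5, n)                    # coherent probe: V_x = 1/2
sqz = heterodyne_x(x0, 0.5 * np.exp(-2 * r), n)   # x-squeezed probe: V_x = e^{-2r}/2

print("outcome variance, coherent probe:", round(coh.var(), 3))
print("outcome variance, squeezed probe:", round(sqz.var(), 3))
print("estimated x0:", round(coh.mean(), 2), "vs", round(sqz.mean(), 2))
```

Both probes give unbiased estimates of the displacement, but the squeezed probe does so with markedly smaller outcome variance at the same number of shots.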
We report an experiment that determines, with the least tomographic effort, whether an unknown two-photon polarization state is entangled or separable. The method measures whole families of optimal entanglement witnesses. We introduce adaptive measurement schemes that greatly speed up the entanglement detection.
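The logic of witness-based detection can be shown in a few lines. The sketch below uses the textbook witness for a Bell state (this specific operator is an illustrative choice, not the families measured in the experiment): its expectation value is nonnegative on every separable state, so a negative measured value certifies entanglement without full tomography.

```python
import numpy as np

# |Phi+> = (|00> + |11>)/sqrt(2)
phi = np.zeros(4)
phi[[0, 3]] = 1 / np.sqrt(2)
proj = np.outer(phi, phi)

# Standard witness for this target state: W = I/2 - |Phi+><Phi+|.
# Tr(W rho) >= 0 for every separable rho; a negative value certifies entanglement.
W = np.eye(4) / 2 - proj

rho_ent = proj            # the Bell state itself
rho_sep = np.eye(4) / 4   # maximally mixed state (separable)

print("Tr(W rho_ent) =", np.trace(W @ rho_ent).real)  # -0.5 -> entangled
print("Tr(W rho_sep) =", np.trace(W @ rho_sep).real)  #  0.25 -> not detected
```

Measuring one expectation value per candidate witness, rather than reconstructing the full state, is what makes adaptive witness families so much faster than tomography.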
Quantum-state reconstruction on a finite number of copies of a quantum system with informationally incomplete measurements, as a rule, does not yield a unique result. We derive a reconstruction scheme where both the likelihood and the von Neumann entropy functionals are maximized in order to systematically select the most-likely estimator with the largest entropy, that is, the least-biased estimator, consistent with a given set of measurement data. This is equivalent to jointly considering our partial knowledge and ignorance about the ensemble in order to reconstruct its identity.
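The non-uniqueness and its entropy-based resolution are easy to see in the simplest case (a toy example with assumed numbers, not the paper's general scheme): if only σ_z is measured on a qubit, every Bloch vector with the observed z-component fits the data equally well, and maximizing the von Neumann entropy over that plateau singles out the state with no unjustified transverse polarization.

```python
import numpy as np

# Informationally incomplete data (assumed): only sigma_z is measured on a qubit,
# with outcome "+1" observed at relative frequency f.
f = 0.8
z = 2 * f - 1  # every maximum-likelihood state has <sigma_z> = z

def rho_bloch(x, y, z):
    # Density matrix from a Bloch vector (x, y, z)
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]])
    sz = np.diag([1.0, -1.0]).astype(complex)
    return 0.5 * (np.eye(2) + x * sx + y * sy + z * sz)

def entropy(rho):
    # von Neumann entropy -Tr(rho ln rho)
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-(ev * np.log(ev)).sum())

# Scan the flat likelihood plateau: every transverse radius r with
# r^2 + z^2 <= 1 reproduces the sigma_z data exactly.
r_max = np.sqrt(1 - z**2)
radii = np.linspace(0.0, r_max, 50)
entropies = [entropy(rho_bloch(r, 0.0, z)) for r in radii]
best = radii[int(np.argmax(entropies))]
print("entropy-maximizing transverse Bloch radius:", best)  # 0.0 -> diagonal state
```

The entropy maximum sits at zero transverse radius, i.e. the diagonal state determined by the data alone: the least-biased member of the maximum-likelihood set.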