We consider increasingly complex models of matrix denoising and dictionary learning in the Bayes-optimal setting, in the challenging regime where the matrices to infer have a rank growing linearly with the system size. This is in contrast with most existing literature, which is concerned with the low-rank (i.e., constant in the system size) regime.
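A minimal numpy sketch of one common extensive-rank observation model (an illustrative assumption, not necessarily the paper's exact setting): the signal is a Wishart-like matrix S = X X^T / sqrt(N) of rank M = alpha * N, observed through additive Gaussian noise of variance Delta.

```python
import numpy as np

# Sketch of an extensive-rank matrix denoising / dictionary learning setup
# (an assumed model for illustration): the signal S has rank M = alpha * N,
# i.e. growing linearly with the system size N, and is observed through
# additive symmetric Gaussian noise of variance Delta.
rng = np.random.default_rng(0)

N, alpha, Delta = 200, 0.5, 0.1
M = int(alpha * N)                      # rank grows linearly with N

X = rng.standard_normal((N, M))         # hidden factors (the "dictionary")
S = X @ X.T / np.sqrt(N)                # extensive-rank signal matrix
G = rng.standard_normal((N, N))
noise = (G + G.T) / np.sqrt(2)          # symmetrized Gaussian noise
Y = S + np.sqrt(Delta) * noise          # noisy observation to denoise

print("signal rank:", np.linalg.matrix_rank(S), "out of N =", N)
```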
We propose the first correct special-purpose quantum circuits for preparation of Bell diagonal states (BDS), and implement them on the IBM Quantum computer, characterizing and testing complex aspects of their quantum correlations in the full parameter space. Among the circuits proposed, one involves only two quantum bits but requires adapted quantum tomography routines handling classical bits in parallel. The entire class of Bell diagonal states is generated, and several characteristic indicators, namely entanglement of formation and concurrence, CHSH non-locality, steering and discord, are experimentally evaluated over the full parameter space and compared with theory.
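A minimal numpy sketch (not code from the paper) of how a Bell diagonal state can be represented as a mixture of the four Bell projectors and how two of the listed indicators can be evaluated; the closed form for the concurrence of a BDS, Wootters' entanglement-of-formation formula, and the Horodecki CHSH criterion are standard two-qubit results, and the function names and example probabilities are illustrative.

```python
import numpy as np

def bell_diagonal_state(p):
    """rho = sum_i p[i] |Bell_i><Bell_i| for a probability vector p of length 4."""
    s = 1 / np.sqrt(2)
    bells = [s * np.array([1, 0, 0, 1]),    # |Phi+>
             s * np.array([1, 0, 0, -1]),   # |Phi->
             s * np.array([0, 1, 1, 0]),    # |Psi+>
             s * np.array([0, 1, -1, 0])]   # |Psi->
    return sum(pi * np.outer(b, b) for pi, b in zip(p, bells))

def concurrence_bds(p):
    # For Bell diagonal states the Wootters concurrence reduces to this form.
    return max(0.0, 2 * max(p) - 1)

def entanglement_of_formation(C):
    # Wootters' formula: E_F = h((1 + sqrt(1 - C^2)) / 2), h = binary entropy.
    if C == 0:
        return 0.0
    x = (1 + np.sqrt(1 - C**2)) / 2
    return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

def chsh_maximum(rho):
    # Horodecki criterion: the maximal CHSH value is 2*sqrt(m1 + m2), where
    # m1, m2 are the two largest eigenvalues of T^T T and
    # T_ij = Tr(rho sigma_i (x) sigma_j) is the correlation matrix.
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]])
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    paulis = [sx, sy, sz]
    T = np.array([[np.real(np.trace(rho @ np.kron(a, b))) for b in paulis]
                  for a in paulis])
    m = np.sort(np.linalg.eigvalsh(T.T @ T))[-2:]
    return 2 * np.sqrt(m.sum())

p = [0.7, 0.1, 0.1, 0.1]                 # example point in the BDS simplex
rho = bell_diagonal_state(p)
C = concurrence_bds(p)
print("concurrence:", C)
print("E_F:", entanglement_of_formation(C))
print("max CHSH value:", chsh_maximum(rho), "(violation above 2)")
```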
Proc Natl Acad Sci U S A, March 2019
Generalized linear models (GLMs) are used in high-dimensional machine learning, statistics, communications, and signal processing. In this paper we analyze GLMs when the data matrix is random, as relevant in problems such as compressed sensing, error-correcting codes, or benchmark models in neural networks. We evaluate the mutual information (or "free entropy") from which we deduce the Bayes-optimal estimation and generalization errors.
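A minimal numpy sketch of the teacher-student GLM setting with a random Gaussian data matrix described above; the Gaussian prior, the sign output channel, and the ridge baseline are illustrative assumptions, whereas the paper characterizes the Bayes-optimal estimation and generalization errors through the mutual information / free entropy rather than any particular estimator.

```python
import numpy as np

# Teacher-student GLM sketch (assumed specifics for illustration): a teacher
# generates labels y = phi(F x* / sqrt(n)) from a random i.i.d. Gaussian data
# matrix F; inference asks for an estimate of x* given (F, y).
rng = np.random.default_rng(1)

n, alpha = 1000, 2.0
m = int(alpha * n)                      # number of observations

F = rng.standard_normal((m, n))         # random data (measurement) matrix
x_star = rng.standard_normal(n)         # teacher signal, Gaussian prior
z = F @ x_star / np.sqrt(n)
y = np.sign(z)                          # example output channel (perceptron-like)

# Ridge-regularized least squares as a simple, suboptimal baseline estimator;
# the Bayes-optimal error is what the free-entropy analysis characterizes.
lam = 1.0
x_hat = np.linalg.solve(F.T @ F / n + lam * np.eye(n), F.T @ y / np.sqrt(n))
print("baseline estimation MSE:", np.mean((x_hat - x_star) ** 2))
```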