Publications by authors named "En-Jui Kuo"

The recovery of an unknown density matrix of large size demands substantial computational resources. State-of-the-art performance has recently been achieved with the factored gradient descent (FGD) algorithm and its variants, since they mitigate the dimensionality barrier by exploiting the underlying low-rank structure of the density matrix. Despite the theoretical guarantee of a linear convergence rate, convergence in practical scenarios is still slow, because the contraction factor of FGD algorithms depends on the condition number κ of the ground-truth state.
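A minimal sketch of the factored-gradient-descent idea the abstract refers to: parametrize the density matrix as ρ = AA† with a tall factor A, and run gradient descent on the squared Frobenius distance to a target matrix. The dimension, rank, step size, and synthetic target below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2  # Hilbert-space dimension and assumed rank (illustrative)

# Synthetic rank-r "ground truth" density matrix M = B B^dagger, trace 1.
B = rng.standard_normal((d, r)) + 1j * rng.standard_normal((d, r))
M = B @ B.conj().T
M /= np.trace(M).real

# Factored gradient descent on f(A) = || A A^dagger - M ||_F^2.
A = 0.1 * (rng.standard_normal((d, r)) + 1j * rng.standard_normal((d, r)))
eta = 0.1  # constant step size (illustrative choice)
for _ in range(2000):
    resid = A @ A.conj().T - M     # current reconstruction error
    A = A - eta * 2 * resid @ A    # gradient step in the factor A

err = np.linalg.norm(A @ A.conj().T - M)
```

Because the iterate AA† is positive semidefinite by construction, the factorization enforces the physical constraint for free; the rate at which `err` shrinks is what the condition number κ of M governs.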


Recurrent neural networks have seen widespread use in modeling dynamical systems across domains such as weather prediction and text prediction. Often one wishes to supplement the experimentally observed dynamics with prior knowledge or intuition about the system. While the recurrent nature of these networks allows them to model arbitrarily long memories in the time series used for training, it makes it harder to impose prior knowledge or intuition through generic constraints.
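The recurrence responsible for that long memory can be sketched with a vanilla RNN cell rolled out over a sequence: every hidden state depends on all earlier inputs through the hidden-to-hidden weights. Shapes and (untrained) weights here are illustrative assumptions, not the architecture used in the publication.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hid = 3, 5
W_x = 0.5 * rng.standard_normal((n_hid, n_in))   # input-to-hidden weights
W_h = 0.5 * rng.standard_normal((n_hid, n_hid))  # hidden-to-hidden: the recurrence
b = np.zeros(n_hid)

def rnn_forward(xs):
    """Roll the cell over a sequence; h_t depends on every earlier input."""
    h = np.zeros(n_hid)
    for x in xs:
        h = np.tanh(W_x @ x + W_h @ h + b)
    return h

seq = rng.standard_normal((10, n_in))
h_final = rnn_forward(seq)
```

Because the final state is a nested composition of all past inputs, a constraint on the model cannot easily be stated as a simple condition on any single weight matrix, which is the difficulty the abstract points to.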


Recurrent neural networks have led to breakthroughs in natural language processing and speech recognition. Here we show that recurrent networks, specifically long short-term memory (LSTM) networks, can also capture the temporal evolution of chemical/biophysical trajectories. Our character-level language model learns a probabilistic model of one-dimensional stochastic trajectories generated from higher-dimensional dynamics.
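The pipeline the abstract describes can be sketched in miniature: discretize a one-dimensional trajectory into a character alphabet, then fit a probabilistic next-character model. As a stand-in for the LSTM language model, this sketch fits only a bigram (first-order) model; the toy trajectory, alphabet size, and binning scheme are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy 1-D stochastic trajectory: a noisy oscillation standing in for a
# low-dimensional projection of higher-dimensional dynamics.
t = np.linspace(0, 20 * np.pi, 5000)
traj = np.sin(t) + 0.1 * rng.standard_normal(t.size)

# Discretize into a 4-letter alphabet by binning at the quartiles.
bins = np.quantile(traj, [0.25, 0.5, 0.75])
chars = np.digitize(traj, bins)  # integers 0..3, one "character" per time step

# Bigram counts -> conditional next-character distribution P(c_{t+1} | c_t).
counts = np.zeros((4, 4))
for a, c in zip(chars[:-1], chars[1:]):
    counts[a, c] += 1
probs = counts / counts.sum(axis=1, keepdims=True)
```

An LSTM replaces the single-step conditioning `P(c_{t+1} | c_t)` with conditioning on the full history through its hidden state, which is what lets it capture long-time kinetics that a bigram model cannot.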
