We introduce a change of perspective on tensor network states that is defined by the computational graph of the contraction of an amplitude. The resulting class of states, which we refer to as tensor network functions, inherits the conceptual advantages of tensor network states while removing computational restrictions arising from the need to converge approximate contractions. We use tensor network functions to compute strict variational estimates of the energy on loopy graphs, analyze their expressive power for ground states, show that we can capture aspects of volume-law time evolution, and provide a mapping of general feed-forward neural networks onto efficient tensor network functions. Our work expands the realm of computable tensor networks to those where accurate contraction methods are not available, and opens up new avenues for the use of tensor networks.
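To make the central idea concrete, here is a minimal illustrative sketch, not the authors' implementation: it assumes a periodic (ring) matrix product state as the simplest loopy graph, a transverse-field Ising Hamiltonian, and plain Metropolis sampling; all names (`amplitude`, `local_energy`) and parameter values are hypothetical choices for the example. The point it illustrates is the one stated above: once the physical indices are fixed to a configuration x, the amplitude ψ(x) is an exact, inexpensive contraction (here a trace of matrix products), so sampled local energies yield a strict variational (Rayleigh-quotient) energy estimate, up to statistical error, with no approximate contraction step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a periodic (ring) MPS of n spins with bond dimension D.
# The ring is a loopy graph, yet fixing the physical indices reduces the
# amplitude to an exact contraction -- a trace of a product of D x D matrices.
n, D = 10, 4
A = 0.5 * rng.normal(size=(n, 2, D, D))  # A[i, s]: matrix for spin value s at site i


def amplitude(x):
    """psi(x): contract the network with all physical indices fixed to x."""
    M = np.eye(D)
    for i, s in enumerate(x):
        M = M @ A[i, s]
    return np.trace(M)


def local_energy(x, h=1.0, J=1.0):
    """E_loc(x) = <x|H|psi> / <x|psi> for a transverse-field Ising ring:
    H = -J sum_i Z_i Z_{i+1} - h sum_i X_i  (example Hamiltonian, not from the paper)."""
    psi_x = amplitude(x)
    z = 1 - 2 * np.asarray(x)            # map bit 0/1 -> spin +1/-1
    e = -J * np.sum(z * np.roll(z, -1))  # diagonal ZZ terms (periodic)
    for i in range(n):                   # off-diagonal X terms flip one spin each
        y = list(x)
        y[i] ^= 1
        e += -h * amplitude(tuple(y)) / psi_x
    return e


# Metropolis sampling of |psi(x)|^2: the sample mean of E_loc estimates
# <psi|H|psi>/<psi|psi>, a strict variational upper bound on the ground-state
# energy (within sampling error), even though the graph has a loop.
x = tuple(rng.integers(0, 2, size=n))
energies = []
for step in range(5000):
    i = rng.integers(n)
    y = list(x)
    y[i] ^= 1
    y = tuple(y)
    if rng.random() < (amplitude(y) / amplitude(x)) ** 2:
        x = y
    if step > 1000:
        energies.append(local_energy(x))

print("variational energy estimate:", np.mean(energies))
```

The design point the sketch is meant to convey: no environment approximation or boundary contraction ever appears; every quantity is built from exactly evaluated amplitudes, which is what keeps the energy estimate strictly variational on graphs where full contraction would otherwise be intractable.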
DOI: http://dx.doi.org/10.1103/PhysRevLett.133.260404