Sign Problem in Tensor-Network Contraction


Saved in:
Bibliographic Details
Main Authors: Jielun Chen, Jiaqing Jiang, Dominik Hangleiter, Norbert Schuch
Format: Article
Language: English
Published: American Physical Society 2025-01-01
Series: PRX Quantum
Online Access: http://doi.org/10.1103/PRXQuantum.6.010312
author Jielun Chen
Jiaqing Jiang
Dominik Hangleiter
Norbert Schuch
collection DOAJ
description We investigate how the computational difficulty of contracting tensor networks depends on the sign structure of the tensor entries. Using results from computational complexity, we observe that the approximate contraction of tensor networks with only positive entries has lower computational complexity as compared to tensor networks with general real or complex entries. This raises the question of how this transition in computational complexity manifests itself in the hardness of different tensor-network-contraction schemes. We pursue this question by studying random tensor networks with varying bias toward positive entries. First, we consider contraction via Monte Carlo sampling and find that the transition from hard to easy occurs when the tensor entries become predominantly positive; this can be understood as a tensor-network manifestation of the well-known negative-sign problem in quantum Monte Carlo. Second, we analyze the commonly used contraction based on boundary tensor networks. The performance of this scheme is governed by the amount of correlation in contiguous parts of the tensor network (which by analogy can be thought of as entanglement). Remarkably, we find that the transition from hard to easy—i.e., from a volume-law to a boundary-law scaling of entanglement—already occurs for a slight bias of the tensor entries toward a positive mean, scaling inversely with the bond dimension D, and thus the problem becomes easy earlier for larger D. This is in contrast both to expectations and to the behavior found in Monte Carlo contraction, where the hardness at fixed bias increases with the bond dimension. To provide insight into this early breakdown of computational hardness and the accompanying entanglement transition, we construct an effective classical statistical-mechanical model that predicts a transition at a bias of the tensor entries of 1/D, confirming our observations.
We conclude by investigating the computational difficulty of computing expectation values of tensor-network wave functions (projected entangled-pair states, PEPSs) and find that in this setting, the complexity of entanglement-based contraction always remains low. We explain this by providing a local transformation that maps PEPS expectation values to a positive-valued tensor network. This not only provides insight into the origin of the observed boundary-law entanglement scaling but also suggests new approaches toward PEPS contraction based on positive decompositions.
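The Monte Carlo sign problem described in the abstract can be illustrated with a minimal numerical sketch (not taken from the paper, and using an illustrative one-dimensional setup rather than the 2D networks studied there): for a chain of random D×D matrices with entries drawn from N(bias, 1), the ratio of the true contraction Z to the contraction of the entrywise absolute values, Z/Z_abs, plays the role of an "average sign". A ratio near 1 means sampling is easy; a ratio near 0 signals a severe sign problem. The function name and parameters below are my own choices.

```python
import numpy as np

def average_sign(n_sites=20, D=4, bias=0.5, seed=0):
    """Estimate the 'average sign' Z / Z_abs for a chain of random
    D x D matrices with entries drawn from N(bias, 1).

    Z is the contraction of the chain closed off with all-ones
    boundary vectors; Z_abs is the same contraction with the
    entrywise absolute values of the matrices.  A ratio close to 1
    indicates predominantly positive entries (no sign problem);
    a ratio close to 0 indicates heavy sign cancellation.
    """
    rng = np.random.default_rng(seed)
    mats = [rng.normal(bias, 1.0, size=(D, D)) for _ in range(n_sites)]
    v = np.ones(D)
    z, z_abs = v.copy(), v.copy()
    for m in mats:
        z = z @ m            # contract the signed network
        z_abs = z_abs @ np.abs(m)  # contract the positive reference
    return (z @ v) / (z_abs @ v)
```

For a strong positive bias the ratio is essentially 1, while for unbiased (zero-mean) entries the contributions cancel and the ratio collapses toward 0, in line with the hard-to-easy transition at predominantly positive entries discussed above.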
format Article
id doaj-art-c4d6d483fc894fae8a3037415229929a
institution DOAJ
issn 2691-3399
language English
publishDate 2025-01-01
publisher American Physical Society
record_format Article
series PRX Quantum
title Sign Problem in Tensor-Network Contraction
url http://doi.org/10.1103/PRXQuantum.6.010312