Cons-training tensor networks: Embedding and optimization over discrete linear constraints

Bibliographic Details
Main Authors: Javier Lopez-Piqueres, Jing Chen
Format: Article
Language: English
Published: SciPost, 2025-06-01
Series:SciPost Physics
Online Access:https://scipost.org/SciPostPhys.18.6.192
Collection: DOAJ
ISSN: 2542-4653
DOI: 10.21468/SciPostPhys.18.6.192

Description
In this study, we introduce a novel family of tensor networks, termed constrained matrix product states (MPS), designed to exactly incorporate arbitrary discrete linear constraints, including inequalities, into sparse block structures. These tensor networks are particularly tailored for modeling distributions with support strictly over the feasible space, offering benefits such as reducing the search space in optimization problems, alleviating overfitting, improving training efficiency, and decreasing model size. Central to our approach is the concept of a quantum region, an extension of the quantum numbers traditionally used in $U(1)$-symmetric tensor networks, adapted to capture any linear constraint, including the unconstrained scenario. We further develop a novel canonical form for these new MPS, which allows for the merging and factorization of tensor blocks according to quantum region fusion rules and permits optimal truncation schemes. Utilizing this canonical form, we apply an unsupervised training strategy to optimize arbitrary objective functions subject to discrete linear constraints. Our method's efficacy is demonstrated by solving the quadratic knapsack problem, achieving superior performance compared to a leading nonlinear integer programming solver. Additionally, we analyze the complexity and scalability of our approach, demonstrating its potential for addressing complex constrained combinatorial optimization problems.
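To illustrate the block-sparsity idea behind the abstract's "quantum regions": for a single linear constraint, each MPS bond can be labeled by the running partial sum of the constraint, and only sums that remain feasible are kept, so the bond index splits into symmetry blocks and everything outside them is exactly zero. The sketch below is a minimal reading of that idea with hypothetical toy coefficients (`w`, `C` are not from the paper, and this is not the authors' implementation):

```python
# Hedged sketch: label MPS bond k by the running partial sum s_k = sum_{i<=k} w_i * x_i
# for a single inequality constraint w . x <= C over binary variables x_i.
# Only partial sums that stay feasible are retained, giving sparse symmetry blocks.
w = [2, 3, 4, 1]   # toy coefficients of the linear constraint (hypothetical)
C = 6              # toy right-hand side (hypothetical)

regions = [{0}]    # bond 0: only the empty partial sum
for wk in w:
    prev = regions[-1]
    # extend each feasible partial sum by x = 0 or x = 1, pruning infeasible ones
    nxt = {s + wk * x for s in prev for x in (0, 1) if s + wk * x <= C}
    regions.append(nxt)

for k, r in enumerate(regions):
    print(f"bond {k}: feasible partial sums {sorted(r)} (block count {len(r)})")
```

The block count per bond plays the role of a bond dimension restricted to the feasible space; configurations violating the constraint are never represented at all.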
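For reference, the quadratic knapsack problem used as the benchmark above can be stated on a toy instance and solved by brute force (the instance data is hypothetical, and this exhaustive search is only a problem statement, not the paper's MPS-based method):

```python
from itertools import product

# Toy quadratic knapsack instance (hypothetical data, for illustration only):
# maximize x^T P x  subject to  w . x <= C,  x binary.
P = [[3, 1, 0],
     [1, 2, 2],
     [0, 2, 4]]
w = [2, 3, 4]   # item weights
C = 5           # knapsack capacity

def value(x):
    # quadratic objective x^T P x
    n = len(x)
    return sum(P[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

# Brute-force search restricted to the feasible space {x : w . x <= C}
best = max((x for x in product((0, 1), repeat=len(w))
            if sum(wi * xi for wi, xi in zip(w, x)) <= C),
           key=value)
print(best, value(best))  # prints: (1, 1, 0) 7
```

Exhaustive search scales as $2^n$; the paper's constrained MPS instead parameterizes a distribution supported only on the feasible set and optimizes it by unsupervised training.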