On the connection between least squares, regularization, and classical shadows

Bibliographic Details
Main Authors: Zhihui Zhu, Joseph M. Lukens, Brian T. Kirby
Format: Article
Language:English
Published: Verein zur Förderung des Open Access Publizierens in den Quantenwissenschaften 2024-08-01
Series:Quantum
Online Access:https://quantum-journal.org/papers/q-2024-08-29-1455/pdf/
author Zhihui Zhu
Joseph M. Lukens
Brian T. Kirby
collection DOAJ
description Classical shadows (CS) offer a resource-efficient means to estimate quantum observables, circumventing the need for exhaustive state tomography. Here, we clarify and explore the connection between CS techniques and least squares (LS) and regularized least squares (RLS) methods commonly used in machine learning and data analysis. By formal identification of LS and RLS "shadows" completely analogous to those in CS---namely, point estimators calculated from the empirical frequencies of single measurements---we show that both RLS and CS can be viewed as regularizers for the underdetermined regime, replacing the pseudoinverse with invertible alternatives. Through numerical simulations, we evaluate RLS and CS from three distinct angles: the tradeoff in bias and variance, mismatch between the expected and actual measurement distributions, and the interplay between the number of measurements and number of shots per measurement. Compared to CS, RLS attains lower variance at the expense of bias, is robust to distribution mismatch, and is more sensitive to the number of shots for a fixed number of state copies---differences that can be understood from the distinct approaches taken to regularization. Conceptually, our integration of LS, RLS, and CS under a unifying "shadow" umbrella aids in advancing the overall picture of CS techniques, while practically our results highlight the tradeoffs intrinsic to these measurement approaches, illuminating the circumstances under which either RLS or CS would be preferred, such as unverified randomness for the former or unbiased estimation for the latter.
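The abstract's central point---that RLS regularizes the underdetermined inversion by replacing the pseudoinverse with an invertible alternative---can be illustrated with a toy linear inversion. This is a minimal sketch, not code from the paper: the matrix `A`, vector `b`, and ridge strength `lam` are made-up example values standing in for the measurement map and empirical frequencies.

```python
# Toy sketch of LS vs. RLS estimation in the underdetermined regime.
# Not the paper's code; A, b, and lam are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(5, 10))   # fewer measurement settings (rows) than unknowns
b = rng.normal(size=5)         # stand-in for empirical measurement frequencies
lam = 0.1                      # ridge regularization strength (assumed value)

# LS "shadow": minimum-norm solution via the Moore-Penrose pseudoinverse
x_ls = np.linalg.pinv(A) @ b

# RLS "shadow": the pseudoinverse is replaced by the invertible map
# (A^T A + lam*I)^{-1} A^T, well defined even though A^T A is singular here
x_rls = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ b)

# Ridge shrinks every singular-value component, trading bias for variance
print(np.linalg.norm(x_ls), np.linalg.norm(x_rls))
```

The shrinkage visible in the printed norms is the bias-variance tradeoff the abstract describes: for any `lam > 0` the RLS estimate has strictly smaller norm than the minimum-norm LS solution, at the cost of a systematic bias.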
format Article
id doaj-art-c871fc7dca05455bad0368df83240032
institution OA Journals
issn 2521-327X
language English
publishDate 2024-08-01
publisher Verein zur Förderung des Open Access Publizierens in den Quantenwissenschaften
record_format Article
series Quantum
spelling Quantum 8, 1455 (2024), doi:10.22331/q-2024-08-29-1455
url https://quantum-journal.org/papers/q-2024-08-29-1455/pdf/
title On the connection between least squares, regularization, and classical shadows
url https://quantum-journal.org/papers/q-2024-08-29-1455/pdf/