Generic increase of observational entropy in isolated systems
Format: Article
Language: English
Published: American Physical Society, 2024-12-01
Series: Physical Review Research
Online Access: http://doi.org/10.1103/PhysRevResearch.6.043327
Summary: Observational entropy—a quantity that unifies Boltzmann's entropy, Gibbs' entropy, von Neumann's macroscopic entropy, and the diagonal entropy—was recently argued to play a key role in a modern formulation of statistical mechanics. Here, relying on algebraic techniques taken from Petz's theory of statistical sufficiency and on a Lévy-type concentration bound, we prove rigorous theorems showing how the observational entropy of a system undergoing a unitary evolution chosen at random tends to increase with overwhelming probability and to reach its maximum very quickly. More precisely, we show that for any observation that is sufficiently coarse with respect to the size of the system, regardless of the initial state of the system (be it pure or mixed), random evolution renders its state practically indistinguishable from the uniform (i.e., maximally mixed) distribution with a probability approaching 1 as the size of the system grows. The same conclusion holds not only for random evolutions sampled according to the unitarily invariant Haar distribution but also for approximate 2-designs, which are thought to provide a more physically and computationally reasonable model of random evolutions.
ISSN: 2643-1564
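The summary's central claim—that a Haar-random unitary drives the observational entropy of almost any initial state close to its maximum, log of the Hilbert-space dimension—can be illustrated with a small numerical sketch. This is not the paper's proof: the dimension, the block coarse-graining, and the QR-based Haar sampler below are illustrative choices. Observational entropy for a projective coarse-graining {P_i} is taken as S_O = −Σ_i p_i log(p_i / V_i), with p_i = Tr(P_i ρ) and V_i = Tr(P_i).

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_unitary(d, rng):
    # QR decomposition of a complex Gaussian matrix, with column phases
    # fixed by the diagonal of R, samples from the Haar measure on U(d).
    z = (rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    phases = np.diag(r) / np.abs(np.diag(r))
    return q * phases

def observational_entropy(rho, projectors):
    # S_O = -sum_i p_i log(p_i / V_i), p_i = Tr(P_i rho), V_i = Tr(P_i)
    s = 0.0
    for P in projectors:
        p = np.real(np.trace(P @ rho))
        V = np.real(np.trace(P))
        if p > 1e-12:
            s -= p * np.log(p / V)
    return s

d = 64   # Hilbert-space dimension (illustrative)
k = 4    # number of equal coarse macrostates, each of dimension d // k
projectors = []
for i in range(k):
    P = np.zeros((d, d))
    P[i * (d // k):(i + 1) * (d // k), i * (d // k):(i + 1) * (d // k)] = np.eye(d // k)
    projectors.append(P)

# Pure initial state concentrated in the first macrostate: low observational entropy.
psi = np.zeros(d)
psi[0] = 1.0
rho0 = np.outer(psi, psi.conj())

# Evolve by a single Haar-random unitary.
U = haar_unitary(d, rng)
rho1 = U @ rho0 @ U.conj().T

s0 = observational_entropy(rho0, projectors)
s1 = observational_entropy(rho1, projectors)
print(s0)  # exactly log(d/k) = log 16 ~ 2.77: all weight in one macrostate
print(s1)  # typically close to the maximum log d = log 64 ~ 4.16
```

By concavity of the logarithm, S_O never exceeds log d, so the second value approaching log 64 is the "maximum reached very quickly" behavior the summary describes; for larger d the concentration around the maximum becomes sharper, in line with the Lévy-type bound invoked in the abstract.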