Kolmogorov Capacity with Overlap


Bibliographic Details
Main Authors: Anshuka Rangi, Massimo Franceschetti
Format: Article
Language: English
Published: MDPI AG, 2025-04-01
Series: Entropy
Subjects: <i>ϵ</i>-capacity; (<i>ϵ</i>,<i>δ</i>)-capacity; mutual information; non-stochastic uncertainty
Online Access: https://www.mdpi.com/1099-4300/27/5/472
Abstract: The notion of <inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><mi>δ</mi></semantics></math></inline-formula>-mutual information between non-stochastic uncertain variables is introduced as a generalization of Nair’s non-stochastic information functional. Several properties of this new quantity are illustrated and used in a communication setting to show that the largest <inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><mi>δ</mi></semantics></math></inline-formula>-mutual information between received and transmitted codewords over <inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><mi>ϵ</mi></semantics></math></inline-formula>-noise channels equals the <inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><mrow><mo>(</mo><mi>ϵ</mi><mo>,</mo><mi>δ</mi><mo>)</mo></mrow></semantics></math></inline-formula>-capacity. This notion of capacity generalizes the Kolmogorov <inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><mi>ϵ</mi></semantics></math></inline-formula>-capacity to packing sets of overlap at most <inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><mi>δ</mi></semantics></math></inline-formula> and is a variation of a previous definition proposed by one of the authors. Results are then extended to more general noise models, including non-stochastic, memoryless, and stationary channels. The presented theory admits the possibility of decoding errors, as in classical information theory, while retaining the worst-case, non-stochastic character of Kolmogorov’s approach.
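As a minimal illustration of the classical quantity the abstract generalizes: Kolmogorov's ϵ-capacity of a set is the logarithm of the size of its largest 2ϵ-distinguishable subset, i.e. a packing of ϵ-balls with zero overlap (the paper relaxes this to overlap at most δ). The sketch below is a toy greedy 2ϵ-packing over a finite point set; the function name and example points are illustrative, and the (ϵ,δ) overlap relaxation itself is not implemented.

```python
import math

def eps_capacity_points(points, eps):
    """Greedily build a 2*eps-packing of a finite point set in R^n.

    Points kept are pairwise more than 2*eps apart, so the eps-balls
    around them are disjoint (zero overlap). The greedy result gives a
    lower bound on the largest such subset; log2 of its size is a lower
    bound on the classical Kolmogorov eps-capacity.
    """
    packing = []
    for p in points:
        if all(math.dist(p, q) > 2 * eps for q in packing):
            packing.append(p)
    return packing

# Points on a line spaced 1 apart; with eps = 0.4 the balls around
# consecutive points are disjoint, so every point survives the packing.
pts = [(float(i),) for i in range(8)]
packed = eps_capacity_points(pts, 0.4)
capacity_bits = math.log2(len(packed))  # 3.0 bits for 8 packed points
```

With a larger ϵ (say 0.6, so the distinguishability threshold 2ϵ exceeds the spacing), only every other point survives, and the capacity bound drops accordingly.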
ISSN: 1099-4300
DOAJ record: doaj-art-f33dd4e556e34b18be3c1fcbfdbe353d
DOI: 10.3390/e27050472
Author affiliations: Department of Electrical and Computer Engineering, University of California at San Diego, 9500 Gilman Drive, Mail Code 0407, La Jolla, CA 92093-0407, USA (both authors)