Relative information spectra with applications to statistical inference

For any pair of probability measures defined on a common space, their relative information spectra, namely the distribution functions of the log-likelihood ratio under either probability measure, fully encapsulate all that is relevant for distinguishing them. This paper explores the properties of the relative information spectra and their connections to various measures of discrepancy, including total variation distance, relative entropy, Rényi divergence, and general $ f $-divergences. A simple definition of sufficient statistics, termed $ I $-sufficiency, is introduced and shown to coincide with longstanding notions under the assumptions that the data model is dominated and the observation space is standard. Additionally, a new measure of discrepancy between probability measures, the NP-divergence, is proposed and shown to determine the area of the error probability pairs achieved by the Neyman-Pearson binary hypothesis tests. For independent identically distributed data models, that area is shown to approach 1 at a rate governed by the Bhattacharyya distance.
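For concreteness (the notation below follows common information-spectrum usage and is assumed here rather than quoted from the article), the two distribution functions referred to in the abstract are

$$ \mathbb{F}_{P\|Q}(x) = P\!\left[\log\frac{\mathrm{d}P}{\mathrm{d}Q}(X)\le x\right], \qquad \overline{\mathbb{F}}_{P\|Q}(x) = Q\!\left[\log\frac{\mathrm{d}P}{\mathrm{d}Q}(X)\le x\right], $$

i.e., the cumulative distribution function of the log-likelihood ratio when the observation $ X $ is distributed according to $ P $ and according to $ Q $, respectively.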

Bibliographic Details
Main Author: Sergio Verdú
Format: Article
Language: English
Published: AIMS Press 2024-12-01
Series: AIMS Mathematics
Subjects: information theory; statistical inference; sufficient statistics; hypothesis testing; Neyman-Pearson tests; information spectrum method; relative entropy; Kullback-Leibler divergence; $ f $-divergence; total variation distance; Bhattacharyya distance
Online Access: https://www.aimspress.com/article/doi/10.3934/math.20241668
author Sergio Verdú
collection DOAJ
description For any pair of probability measures defined on a common space, their relative information spectra, namely the distribution functions of the log-likelihood ratio under either probability measure, fully encapsulate all that is relevant for distinguishing them. This paper explores the properties of the relative information spectra and their connections to various measures of discrepancy, including total variation distance, relative entropy, Rényi divergence, and general $ f $-divergences. A simple definition of sufficient statistics, termed $ I $-sufficiency, is introduced and shown to coincide with longstanding notions under the assumptions that the data model is dominated and the observation space is standard. Additionally, a new measure of discrepancy between probability measures, the NP-divergence, is proposed and shown to determine the area of the error probability pairs achieved by the Neyman-Pearson binary hypothesis tests. For independent identically distributed data models, that area is shown to approach 1 at a rate governed by the Bhattacharyya distance.
format Article
id doaj-art-4e09045318f748ac9cb6a0967f906111
institution Kabale University
issn 2473-6988
language English
publishDate 2024-12-01
publisher AIMS Press
record_format Article
series AIMS Mathematics
spelling doaj-art-4e09045318f748ac9cb6a0967f906111 (indexed 2025-01-23T07:53:25Z); AIMS Press, AIMS Mathematics, ISSN 2473-6988; vol. 9, no. 12 (2024-12-01), pp. 35038-35090; DOI 10.3934/math.20241668; Sergio Verdú, Independent researcher, Princeton, NJ 08540, USA; https://www.aimspress.com/article/doi/10.3934/math.20241668
title Relative information spectra with applications to statistical inference
topic information theory
statistical inference
sufficient statistics
hypothesis testing
neyman-pearson tests
information spectrum method
relative entropy
kullback-leibler divergence
$ f $-divergence
total variation distance
bhattacharyya distance
url https://www.aimspress.com/article/doi/10.3934/math.20241668