A measure of mutual divergence among a number of probability distributions
The principle of optimality of dynamic programming is used to prove three major inequalities due to Shannon, Rényi, and Hölder. The inequalities are then used to obtain some useful results in information theory. In particular, measures of the mutual divergence among two or more probability distributions are obtained.
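The record's abstract does not reproduce the measures themselves. As illustrative background only (not the paper's own construction), one standard measure of mutual divergence among several probability distributions is the generalized Jensen–Shannon divergence, whose non-negativity follows from the concavity of Shannon entropy — the kind of property the Shannon inequality cited above underwrites. A minimal sketch:

```python
import math

def shannon_entropy(p):
    """Shannon entropy of a discrete distribution (natural log)."""
    return -sum(x * math.log(x) for x in p if x > 0)

def mutual_divergence(dists, weights=None):
    """Generalized Jensen-Shannon divergence among several distributions:
    H(sum_i w_i P_i) - sum_i w_i H(P_i).
    Non-negative by concavity of H; zero when all P_i coincide."""
    n = len(dists)
    if weights is None:
        weights = [1.0 / n] * n  # equal weights by default
    # Pointwise mixture distribution sum_i w_i P_i
    mixture = [sum(w * p[k] for w, p in zip(weights, dists))
               for k in range(len(dists[0]))]
    return shannon_entropy(mixture) - sum(
        w * shannon_entropy(p) for w, p in zip(weights, dists))

p = [0.5, 0.5]
q = [0.9, 0.1]
print(mutual_divergence([p, q]))     # strictly positive: p and q differ
print(mutual_divergence([p, p, p]))  # vanishes: identical distributions
```

The measure is symmetric in its arguments and accepts any number of distributions, which matches the "two or more" setting described in the abstract; the specific measures derived in the article may differ.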
| Main Authors: | J. N. Kapur, Vinod Kumar, Uma Kumar |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Wiley, 1987-01-01 |
| Series: | International Journal of Mathematics and Mathematical Sciences |
| Online Access: | http://dx.doi.org/10.1155/S016117128700070X |
Similar Items

- Bounds on the Excess Minimum Risk via Generalized Information Divergence Measures
  by: Ananya Omanwar, et al.
  Published: (2025-07-01)
- Properties of Shannon and Rényi entropies of the Poisson distribution as the functions of intensity parameter
  by: Volodymyr Braiman, et al.
  Published: (2024-07-01)
- An information theoretic limit to data amplification
  by: S J Watts, et al.
  Published: (2025-01-01)
- Refining Jensen–Mercer inequality and its applications in probability and statistics
  by: Rabia Bibi, et al.
  Published: (2025-07-01)
- Prediction of the Time Distribution of Shannon and Renyi Entropy Based on the Theory of Moments
  by: Iehor V. Sedliarov, et al.
  Published: (2025-02-01)